Apr 24 23:51:20.259726 ip-10-0-133-214 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 24 23:51:20.259739 ip-10-0-133-214 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 24 23:51:20.259749 ip-10-0-133-214 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 24 23:51:20.260055 ip-10-0-133-214 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 24 23:51:30.319912 ip-10-0-133-214 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 24 23:51:30.319935 ip-10-0-133-214 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 93ef7272380844f686e180aef379fb3f --
Apr 24 23:53:41.277704 ip-10-0-133-214 systemd[1]: Starting Kubernetes Kubelet...
Apr 24 23:53:41.715631 ip-10-0-133-214 kubenswrapper[2569]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 23:53:41.715631 ip-10-0-133-214 kubenswrapper[2569]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 24 23:53:41.715631 ip-10-0-133-214 kubenswrapper[2569]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 23:53:41.715631 ip-10-0-133-214 kubenswrapper[2569]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 24 23:53:41.715631 ip-10-0-133-214 kubenswrapper[2569]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 23:53:41.718090 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.717998 2569 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 24 23:53:41.720375 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.720361 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 23:53:41.720375 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.720376 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 23:53:41.720449 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.720380 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 23:53:41.720449 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.720383 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 23:53:41.720449 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.720387 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 23:53:41.720449 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.720390 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 23:53:41.720449 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.720392 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 23:53:41.720449 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.720395 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 23:53:41.720449 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.720411 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 23:53:41.720449 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.720415 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 23:53:41.720449 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.720422 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 23:53:41.720449 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.720425 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 23:53:41.720449 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.720428 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 23:53:41.720449 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.720431 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 23:53:41.720449 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.720434 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 23:53:41.720449 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.720438 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 23:53:41.720449 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.720442 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 23:53:41.720449 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.720445 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 23:53:41.720449 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.720448 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 23:53:41.720449 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.720451 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 23:53:41.720449 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.720454 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 23:53:41.720449 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.720457 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 23:53:41.720930 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.720460 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 23:53:41.720930 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.720463 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 23:53:41.720930 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.720466 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 23:53:41.720930 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.720469 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 23:53:41.720930 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.720473 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 23:53:41.720930 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.720478 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 23:53:41.720930 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.720481 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 23:53:41.720930 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.720485 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 23:53:41.720930 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.720487 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 23:53:41.720930 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.720491 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 23:53:41.720930 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.720494 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 23:53:41.720930 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.720496 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 23:53:41.720930 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.720499 2569 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 23:53:41.720930 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.720502 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 23:53:41.720930 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.720504 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 23:53:41.720930 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.720507 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 23:53:41.720930 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.720509 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 23:53:41.720930 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.720512 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 23:53:41.720930 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.720514 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 23:53:41.720930 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.720517 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 23:53:41.721438 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.720519 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 23:53:41.721438 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.720522 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 23:53:41.721438 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.720524 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 23:53:41.721438 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.720527 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 23:53:41.721438 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.720529 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 23:53:41.721438 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.720532 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 23:53:41.721438 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.720534 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 23:53:41.721438 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.720536 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 23:53:41.721438 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.720539 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 23:53:41.721438 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.720541 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 23:53:41.721438 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.720544 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 23:53:41.721438 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.720546 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 23:53:41.721438 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.720549 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 23:53:41.721438 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.720557 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 23:53:41.721438 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.720560 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 23:53:41.721438 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.720563 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 23:53:41.721438 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.720565 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 23:53:41.721438 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.720568 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 23:53:41.721438 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.720570 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 23:53:41.721438 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.720573 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 23:53:41.721974 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.720576 2569 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 23:53:41.721974 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.720579 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 23:53:41.721974 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.720581 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 23:53:41.721974 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.720584 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 23:53:41.721974 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.720587 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 23:53:41.721974 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.720589 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 23:53:41.721974 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.720592 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 23:53:41.721974 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.720594 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 23:53:41.721974 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.720596 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 23:53:41.721974 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.720601 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 23:53:41.721974 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.720604 2569 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 23:53:41.721974 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.720606 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 23:53:41.721974 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.720609 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 23:53:41.721974 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.720611 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 23:53:41.721974 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.720614 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 23:53:41.721974 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.720616 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 23:53:41.721974 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.720619 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 23:53:41.721974 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.720621 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 23:53:41.721974 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.720624 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 23:53:41.721974 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.720626 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 23:53:41.722471 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.720629 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 23:53:41.722471 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.720631 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 23:53:41.722471 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.720634 2569 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 23:53:41.722471 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.720636 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 23:53:41.722471 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.721063 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 23:53:41.722471 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.721071 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 23:53:41.722471 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.721075 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 23:53:41.722471 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.721078 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 23:53:41.722471 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.721082 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 23:53:41.722471 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.721085 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 23:53:41.722471 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.721088 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 23:53:41.722471 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.721090 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 23:53:41.722471 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.721093 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 23:53:41.722471 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.721096 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 23:53:41.722471 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.721099 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 23:53:41.722471 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.721102 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 23:53:41.722471 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.721105 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 23:53:41.722471 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.721107 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 23:53:41.722471 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.721110 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 23:53:41.722471 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.721113 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 23:53:41.722962 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.721116 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 23:53:41.722962 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.721120 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 23:53:41.722962 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.721122 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 23:53:41.722962 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.721125 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 23:53:41.722962 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.721128 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 23:53:41.722962 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.721132 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 23:53:41.722962 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.721136 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 23:53:41.722962 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.721140 2569 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 23:53:41.722962 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.721143 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 23:53:41.722962 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.721146 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 23:53:41.722962 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.721148 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 23:53:41.722962 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.721151 2569 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 23:53:41.722962 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.721153 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 23:53:41.722962 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.721156 2569 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 23:53:41.722962 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.721158 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 23:53:41.722962 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.721161 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 23:53:41.722962 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.721163 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 23:53:41.722962 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.721166 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 23:53:41.722962 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.721169 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 23:53:41.723462 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.721172 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 23:53:41.723462 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.721174 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 23:53:41.723462 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.721177 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 23:53:41.723462 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.721181 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 23:53:41.723462 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.721183 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 23:53:41.723462 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.721186 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 23:53:41.723462 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.721188 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 23:53:41.723462 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.721191 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 23:53:41.723462 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.721194 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 23:53:41.723462 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.721196 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 23:53:41.723462 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.721199 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 23:53:41.723462 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.721201 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 23:53:41.723462 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.721203 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 23:53:41.723462 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.721206 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 23:53:41.723462 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.721208 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 23:53:41.723462 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.721211 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 23:53:41.723462 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.721214 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 23:53:41.723462 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.721216 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 23:53:41.723462 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.721218 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 23:53:41.723936 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.721221 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 23:53:41.723936 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.721223 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 23:53:41.723936 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.721226 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 23:53:41.723936 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.721228 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 23:53:41.723936 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.721231 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 23:53:41.723936 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.721234 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 23:53:41.723936 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.721236 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 23:53:41.723936 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.721239 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 23:53:41.723936 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.721242 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 23:53:41.723936 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.721244 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 23:53:41.723936 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.721248 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 23:53:41.723936 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.721251 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 23:53:41.723936 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.721253 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 23:53:41.723936 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.721256 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 23:53:41.723936 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.721258 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 23:53:41.723936 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.721261 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 23:53:41.723936 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.721263 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 23:53:41.723936 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.721266 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 23:53:41.723936 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.721268 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 23:53:41.724409 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.721286 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 23:53:41.724409 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.721291 2569 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 23:53:41.724409 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.721295 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 23:53:41.724409 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.721299 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 23:53:41.724409 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.721301 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 23:53:41.724409 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.721304 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 23:53:41.724409 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.721307 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 23:53:41.724409 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.721309 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 23:53:41.724409 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.721312 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 23:53:41.724409 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.721314 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 23:53:41.724409 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.721317 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 23:53:41.724409 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.721320 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 23:53:41.724409 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.721323 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 23:53:41.724409 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.722813 2569 flags.go:64] FLAG: --address="0.0.0.0"
Apr 24 23:53:41.724409 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.722829 2569 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 24 23:53:41.724409 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.722836 2569 flags.go:64] FLAG: --anonymous-auth="true"
Apr 24 23:53:41.724409 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.722841 2569 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 24 23:53:41.724409 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.722846 2569 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 24 23:53:41.724409 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.722850 2569 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 24 23:53:41.724409 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.722854 2569 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 24 23:53:41.724409 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.722859 2569 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 24 23:53:41.724913 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.722866 2569 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 24 23:53:41.724913 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.722869 2569 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 24 23:53:41.724913 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.722873 2569 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 24 23:53:41.724913 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.722878 2569 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 24 23:53:41.724913 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.722881 2569 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 24 23:53:41.724913 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.722885 2569 flags.go:64] FLAG: --cgroup-root=""
Apr 24 23:53:41.724913 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.722888 2569 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 24 23:53:41.724913 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.722891 2569 flags.go:64] FLAG: --client-ca-file=""
Apr 24 23:53:41.724913 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.722894 2569 flags.go:64] FLAG: --cloud-config=""
Apr 24 23:53:41.724913 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.722897 2569 flags.go:64] FLAG: --cloud-provider="external"
Apr 24 23:53:41.724913 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.722899 2569 flags.go:64] FLAG: --cluster-dns="[]"
Apr 24 23:53:41.724913 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.722904 2569 flags.go:64] FLAG: --cluster-domain=""
Apr 24 23:53:41.724913 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.722907 2569 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 24 23:53:41.724913 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.722910 2569 flags.go:64] FLAG: --config-dir=""
Apr 24 23:53:41.724913 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.722913 2569 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 24 23:53:41.724913 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.722917 2569 flags.go:64] FLAG: --container-log-max-files="5"
Apr 24 23:53:41.724913 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.722921 2569 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 24 23:53:41.724913 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.722924 2569 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 24 23:53:41.724913 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.722927 2569 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 24 23:53:41.724913 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.722931 2569 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 24 23:53:41.724913 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.722933 2569 flags.go:64] FLAG: --contention-profiling="false"
Apr 24 23:53:41.724913 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.722937 2569 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 24 23:53:41.724913 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.722941 2569 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 24 23:53:41.724913 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.722944 2569 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 24 23:53:41.724913 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.722947 2569 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 24 23:53:41.725574 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.722951 2569 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 24 23:53:41.725574 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.722955 2569 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 24 23:53:41.725574 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.722958 2569 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 24 23:53:41.725574 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.722961 2569 flags.go:64] FLAG: --enable-load-reader="false"
Apr 24 23:53:41.725574 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.722964 2569 flags.go:64] FLAG: --enable-server="true"
Apr 24 23:53:41.725574 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.722967 2569 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 24 23:53:41.725574 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.722972 2569 flags.go:64] FLAG: --event-burst="100"
Apr 24 23:53:41.725574 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.722976 2569 flags.go:64] FLAG: --event-qps="50"
Apr 24 23:53:41.725574 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.722979 2569 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 24 23:53:41.725574 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.722982 2569 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 24 23:53:41.725574 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.722986 2569 flags.go:64] FLAG: --eviction-hard=""
Apr 24 23:53:41.725574 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.722990 2569 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 24 23:53:41.725574 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.722993 2569 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 24 23:53:41.725574 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.722996 2569 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 24 23:53:41.725574 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.722999 2569 flags.go:64] FLAG: --eviction-soft=""
Apr 24 23:53:41.725574 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.723002 2569 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 24 23:53:41.725574 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.723005 2569 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 24 23:53:41.725574 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.723008 2569 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 24 23:53:41.725574 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.723011 2569 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 24 23:53:41.725574 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.723014 2569 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 24 23:53:41.725574 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.723016 2569 flags.go:64] FLAG: --fail-swap-on="true"
Apr 24 23:53:41.725574 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.723019 2569 flags.go:64] FLAG: --feature-gates=""
Apr 24 23:53:41.725574 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.723023 2569 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 24 23:53:41.725574 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.723026 2569 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 24 23:53:41.725574 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.723029 2569 
flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 24 23:53:41.726184 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.723033 2569 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 24 23:53:41.726184 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.723036 2569 flags.go:64] FLAG: --healthz-port="10248" Apr 24 23:53:41.726184 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.723039 2569 flags.go:64] FLAG: --help="false" Apr 24 23:53:41.726184 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.723042 2569 flags.go:64] FLAG: --hostname-override="ip-10-0-133-214.ec2.internal" Apr 24 23:53:41.726184 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.723045 2569 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 24 23:53:41.726184 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.723049 2569 flags.go:64] FLAG: --http-check-frequency="20s" Apr 24 23:53:41.726184 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.723051 2569 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 24 23:53:41.726184 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.723055 2569 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 24 23:53:41.726184 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.723058 2569 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 24 23:53:41.726184 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.723061 2569 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 24 23:53:41.726184 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.723064 2569 flags.go:64] FLAG: --image-service-endpoint="" Apr 24 23:53:41.726184 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.723067 2569 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 24 23:53:41.726184 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.723069 2569 flags.go:64] FLAG: --kube-api-burst="100" Apr 24 23:53:41.726184 ip-10-0-133-214 
kubenswrapper[2569]: I0424 23:53:41.723072 2569 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 24 23:53:41.726184 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.723076 2569 flags.go:64] FLAG: --kube-api-qps="50" Apr 24 23:53:41.726184 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.723079 2569 flags.go:64] FLAG: --kube-reserved="" Apr 24 23:53:41.726184 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.723082 2569 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 24 23:53:41.726184 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.723085 2569 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 24 23:53:41.726184 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.723088 2569 flags.go:64] FLAG: --kubelet-cgroups="" Apr 24 23:53:41.726184 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.723091 2569 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 24 23:53:41.726184 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.723094 2569 flags.go:64] FLAG: --lock-file="" Apr 24 23:53:41.726184 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.723097 2569 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 24 23:53:41.726184 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.723100 2569 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 24 23:53:41.726184 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.723103 2569 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 24 23:53:41.726825 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.723109 2569 flags.go:64] FLAG: --log-json-split-stream="false" Apr 24 23:53:41.726825 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.723112 2569 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 24 23:53:41.726825 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.723115 2569 flags.go:64] FLAG: --log-text-split-stream="false" Apr 24 23:53:41.726825 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.723118 2569 flags.go:64] FLAG: 
--logging-format="text" Apr 24 23:53:41.726825 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.723121 2569 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 24 23:53:41.726825 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.723124 2569 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 24 23:53:41.726825 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.723127 2569 flags.go:64] FLAG: --manifest-url="" Apr 24 23:53:41.726825 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.723130 2569 flags.go:64] FLAG: --manifest-url-header="" Apr 24 23:53:41.726825 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.723135 2569 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 24 23:53:41.726825 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.723138 2569 flags.go:64] FLAG: --max-open-files="1000000" Apr 24 23:53:41.726825 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.723142 2569 flags.go:64] FLAG: --max-pods="110" Apr 24 23:53:41.726825 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.723145 2569 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 24 23:53:41.726825 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.723148 2569 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 24 23:53:41.726825 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.723151 2569 flags.go:64] FLAG: --memory-manager-policy="None" Apr 24 23:53:41.726825 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.723154 2569 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 24 23:53:41.726825 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.723158 2569 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 24 23:53:41.726825 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.723161 2569 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 24 23:53:41.726825 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.723164 2569 flags.go:64] FLAG: 
--node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 24 23:53:41.726825 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.723171 2569 flags.go:64] FLAG: --node-status-max-images="50" Apr 24 23:53:41.726825 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.723174 2569 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 24 23:53:41.726825 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.723177 2569 flags.go:64] FLAG: --oom-score-adj="-999" Apr 24 23:53:41.726825 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.723181 2569 flags.go:64] FLAG: --pod-cidr="" Apr 24 23:53:41.726825 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.723184 2569 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 24 23:53:41.727384 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.723189 2569 flags.go:64] FLAG: --pod-manifest-path="" Apr 24 23:53:41.727384 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.723192 2569 flags.go:64] FLAG: --pod-max-pids="-1" Apr 24 23:53:41.727384 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.723195 2569 flags.go:64] FLAG: --pods-per-core="0" Apr 24 23:53:41.727384 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.723198 2569 flags.go:64] FLAG: --port="10250" Apr 24 23:53:41.727384 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.723202 2569 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 24 23:53:41.727384 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.723211 2569 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-038be303c54b57a9b" Apr 24 23:53:41.727384 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.723215 2569 flags.go:64] FLAG: --qos-reserved="" Apr 24 23:53:41.727384 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.723218 2569 flags.go:64] FLAG: --read-only-port="10255" Apr 24 23:53:41.727384 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.723221 
2569 flags.go:64] FLAG: --register-node="true" Apr 24 23:53:41.727384 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.723224 2569 flags.go:64] FLAG: --register-schedulable="true" Apr 24 23:53:41.727384 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.723227 2569 flags.go:64] FLAG: --register-with-taints="" Apr 24 23:53:41.727384 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.723235 2569 flags.go:64] FLAG: --registry-burst="10" Apr 24 23:53:41.727384 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.723238 2569 flags.go:64] FLAG: --registry-qps="5" Apr 24 23:53:41.727384 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.723241 2569 flags.go:64] FLAG: --reserved-cpus="" Apr 24 23:53:41.727384 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.723243 2569 flags.go:64] FLAG: --reserved-memory="" Apr 24 23:53:41.727384 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.723247 2569 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 24 23:53:41.727384 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.723250 2569 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 24 23:53:41.727384 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.723253 2569 flags.go:64] FLAG: --rotate-certificates="false" Apr 24 23:53:41.727384 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.723256 2569 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 24 23:53:41.727384 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.723259 2569 flags.go:64] FLAG: --runonce="false" Apr 24 23:53:41.727384 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.723262 2569 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 24 23:53:41.727384 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.723265 2569 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 24 23:53:41.727384 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.723268 2569 flags.go:64] FLAG: --seccomp-default="false" Apr 24 23:53:41.727384 ip-10-0-133-214 kubenswrapper[2569]: I0424 
23:53:41.723271 2569 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 24 23:53:41.727384 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.723275 2569 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 24 23:53:41.727384 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.723278 2569 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 24 23:53:41.728022 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.723281 2569 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 24 23:53:41.728022 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.723284 2569 flags.go:64] FLAG: --storage-driver-password="root" Apr 24 23:53:41.728022 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.723287 2569 flags.go:64] FLAG: --storage-driver-secure="false" Apr 24 23:53:41.728022 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.723290 2569 flags.go:64] FLAG: --storage-driver-table="stats" Apr 24 23:53:41.728022 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.723293 2569 flags.go:64] FLAG: --storage-driver-user="root" Apr 24 23:53:41.728022 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.723295 2569 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 24 23:53:41.728022 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.723299 2569 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 24 23:53:41.728022 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.723302 2569 flags.go:64] FLAG: --system-cgroups="" Apr 24 23:53:41.728022 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.723305 2569 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 24 23:53:41.728022 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.723310 2569 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 24 23:53:41.728022 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.723313 2569 flags.go:64] FLAG: --tls-cert-file="" Apr 24 23:53:41.728022 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.723316 2569 flags.go:64] FLAG: 
--tls-cipher-suites="[]" Apr 24 23:53:41.728022 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.723320 2569 flags.go:64] FLAG: --tls-min-version="" Apr 24 23:53:41.728022 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.723323 2569 flags.go:64] FLAG: --tls-private-key-file="" Apr 24 23:53:41.728022 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.723325 2569 flags.go:64] FLAG: --topology-manager-policy="none" Apr 24 23:53:41.728022 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.723328 2569 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 24 23:53:41.728022 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.723331 2569 flags.go:64] FLAG: --topology-manager-scope="container" Apr 24 23:53:41.728022 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.723334 2569 flags.go:64] FLAG: --v="2" Apr 24 23:53:41.728022 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.723339 2569 flags.go:64] FLAG: --version="false" Apr 24 23:53:41.728022 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.723343 2569 flags.go:64] FLAG: --vmodule="" Apr 24 23:53:41.728022 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.723347 2569 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 24 23:53:41.728022 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.723351 2569 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 24 23:53:41.728022 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.723458 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 23:53:41.728022 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.723463 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 23:53:41.728606 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.723466 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 23:53:41.728606 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.723469 2569 feature_gate.go:328] unrecognized feature 
gate: AWSDedicatedHosts Apr 24 23:53:41.728606 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.723472 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 23:53:41.728606 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.723474 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 23:53:41.728606 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.723477 2569 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 23:53:41.728606 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.723480 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 23:53:41.728606 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.723482 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 23:53:41.728606 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.723486 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 23:53:41.728606 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.723488 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 23:53:41.728606 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.723491 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 23:53:41.728606 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.723493 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 23:53:41.728606 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.723496 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 23:53:41.728606 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.723498 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 23:53:41.728606 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.723501 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 23:53:41.728606 ip-10-0-133-214 kubenswrapper[2569]: W0424 
23:53:41.723505 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 23:53:41.728606 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.723507 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 23:53:41.728606 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.723512 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 23:53:41.728606 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.723514 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 23:53:41.728606 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.723519 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 24 23:53:41.729093 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.723522 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 23:53:41.729093 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.723526 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 23:53:41.729093 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.723529 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 23:53:41.729093 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.723532 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 23:53:41.729093 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.723534 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 23:53:41.729093 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.723537 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 23:53:41.729093 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.723539 2569 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 23:53:41.729093 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.723542 2569 feature_gate.go:328] 
unrecognized feature gate: ExternalOIDC Apr 24 23:53:41.729093 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.723545 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 23:53:41.729093 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.723547 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 23:53:41.729093 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.723550 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 23:53:41.729093 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.723553 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 23:53:41.729093 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.723555 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 23:53:41.729093 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.723558 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 23:53:41.729093 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.723561 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 23:53:41.729093 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.723563 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 23:53:41.729093 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.723566 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 23:53:41.729093 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.723569 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 23:53:41.729093 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.723571 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 23:53:41.729093 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.723574 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 23:53:41.729615 ip-10-0-133-214 
kubenswrapper[2569]: W0424 23:53:41.723577 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 23:53:41.729615 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.723580 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 23:53:41.729615 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.723583 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 23:53:41.729615 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.723585 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 23:53:41.729615 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.723588 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 23:53:41.729615 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.723590 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 23:53:41.729615 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.723593 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 23:53:41.729615 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.723597 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 23:53:41.729615 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.723599 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 23:53:41.729615 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.723603 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 23:53:41.729615 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.723607 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 24 23:53:41.729615 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.723610 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 23:53:41.729615 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.723613 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 23:53:41.729615 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.723616 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 23:53:41.729615 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.723619 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 23:53:41.729615 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.723622 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 23:53:41.729615 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.723624 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 23:53:41.729615 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.723628 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 23:53:41.729615 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.723630 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 23:53:41.730242 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.723633 2569 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 23:53:41.730242 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.723636 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 23:53:41.730242 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.723638 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 23:53:41.730242 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.723641 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 23:53:41.730242 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.723643 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 23:53:41.730242 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.723646 2569 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 23:53:41.730242 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.723648 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 23:53:41.730242 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.723651 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 23:53:41.730242 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.723653 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 23:53:41.730242 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.723656 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 23:53:41.730242 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.723659 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 23:53:41.730242 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.723661 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 23:53:41.730242 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.723664 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 23:53:41.730242 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.723666 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 23:53:41.730242 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.723668 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 23:53:41.730242 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.723672 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 23:53:41.730242 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.723674 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 23:53:41.730242 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.723677 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 23:53:41.730242 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.723679 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 23:53:41.730242 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.723682 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 23:53:41.731048 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.723686 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 23:53:41.731048 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.723688 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 23:53:41.731048 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.723692 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 23:53:41.731048 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.723694 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 23:53:41.731048 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.723697 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 23:53:41.731048 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.723700 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 23:53:41.731048 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.724507 2569 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 23:53:41.732086 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.732068 2569 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 24 23:53:41.732086 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.732086 2569 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 24 23:53:41.732153 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732137 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 23:53:41.732153 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732142 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 23:53:41.732153 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732146 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 23:53:41.732153 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732149 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 23:53:41.732153 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732152 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 23:53:41.732281 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732156 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 23:53:41.732281 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732161 2569 feature_gate.go:328] unrecognized feature gate: Example Apr 24 23:53:41.732281 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732164 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 23:53:41.732281 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732167 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 23:53:41.732281 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732170 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 23:53:41.732281 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732173 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 23:53:41.732281 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732175 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 23:53:41.732281 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732178 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 23:53:41.732281 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732181 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 23:53:41.732281 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732183 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 23:53:41.732281 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732186 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 23:53:41.732281 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732189 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 23:53:41.732281 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732191 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 23:53:41.732281 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732194 2569 
feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 23:53:41.732281 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732196 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 23:53:41.732281 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732199 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 23:53:41.732281 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732202 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 23:53:41.732281 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732204 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 23:53:41.732281 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732207 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 23:53:41.732770 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732210 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 23:53:41.732770 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732213 2569 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 23:53:41.732770 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732216 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 23:53:41.732770 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732218 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 23:53:41.732770 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732221 2569 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 23:53:41.732770 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732223 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 23:53:41.732770 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732226 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 23:53:41.732770 ip-10-0-133-214 
kubenswrapper[2569]: W0424 23:53:41.732228 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 23:53:41.732770 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732230 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 23:53:41.732770 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732233 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 23:53:41.732770 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732236 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 23:53:41.732770 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732238 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 23:53:41.732770 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732241 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 23:53:41.732770 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732243 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 23:53:41.732770 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732247 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 23:53:41.732770 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732250 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 23:53:41.732770 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732253 2569 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 23:53:41.732770 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732255 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 23:53:41.732770 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732258 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 23:53:41.732770 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732261 2569 feature_gate.go:328] unrecognized feature 
gate: MixedCPUsAllocation Apr 24 23:53:41.733252 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732265 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 24 23:53:41.733252 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732269 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 23:53:41.733252 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732272 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 23:53:41.733252 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732274 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 23:53:41.733252 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732277 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 23:53:41.733252 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732280 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 23:53:41.733252 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732282 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 23:53:41.733252 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732285 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 23:53:41.733252 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732288 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 23:53:41.733252 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732290 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 23:53:41.733252 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732293 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 23:53:41.733252 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732296 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 23:53:41.733252 ip-10-0-133-214 
kubenswrapper[2569]: W0424 23:53:41.732298 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 23:53:41.733252 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732301 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 23:53:41.733252 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732304 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 23:53:41.733252 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732306 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 23:53:41.733252 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732309 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 23:53:41.733252 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732311 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 23:53:41.733252 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732314 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 23:53:41.733733 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732316 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 23:53:41.733733 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732319 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 23:53:41.733733 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732322 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 23:53:41.733733 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732324 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 23:53:41.733733 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732327 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 23:53:41.733733 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732329 2569 feature_gate.go:328] unrecognized feature 
gate: InsightsConfig Apr 24 23:53:41.733733 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732332 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 23:53:41.733733 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732335 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 23:53:41.733733 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732338 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 23:53:41.733733 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732340 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 23:53:41.733733 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732343 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 23:53:41.733733 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732345 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 23:53:41.733733 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732348 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 23:53:41.733733 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732351 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 23:53:41.733733 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732354 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 23:53:41.733733 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732356 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 23:53:41.733733 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732359 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 23:53:41.733733 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732361 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 23:53:41.733733 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732364 2569 
feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 23:53:41.733733 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732367 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 23:53:41.734227 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732369 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 23:53:41.734227 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732372 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 23:53:41.734227 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732374 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 23:53:41.734227 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.732379 2569 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 24 23:53:41.734227 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732509 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 23:53:41.734227 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732515 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 23:53:41.734227 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732518 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 23:53:41.734227 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732520 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 23:53:41.734227 ip-10-0-133-214 kubenswrapper[2569]: W0424 
23:53:41.732523 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 23:53:41.734227 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732526 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 23:53:41.734227 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732528 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 23:53:41.734227 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732531 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 23:53:41.734227 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732534 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 23:53:41.734227 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732536 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 23:53:41.734227 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732539 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 23:53:41.734227 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732541 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 23:53:41.734640 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732544 2569 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 23:53:41.734640 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732547 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 23:53:41.734640 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732550 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 23:53:41.734640 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732552 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 23:53:41.734640 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732555 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 
23:53:41.734640 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732557 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 23:53:41.734640 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732560 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 23:53:41.734640 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732562 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 23:53:41.734640 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732565 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 23:53:41.734640 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732568 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 23:53:41.734640 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732571 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 23:53:41.734640 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732574 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 23:53:41.734640 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732577 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 23:53:41.734640 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732580 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 23:53:41.734640 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732583 2569 feature_gate.go:328] unrecognized feature gate: Example Apr 24 23:53:41.734640 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732585 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 23:53:41.734640 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732588 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 23:53:41.734640 ip-10-0-133-214 kubenswrapper[2569]: W0424 
23:53:41.732590 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 23:53:41.734640 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732592 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 23:53:41.734640 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732595 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 23:53:41.735116 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732597 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 23:53:41.735116 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732600 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 23:53:41.735116 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732602 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 23:53:41.735116 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732605 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 23:53:41.735116 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732607 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 23:53:41.735116 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732610 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 23:53:41.735116 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732612 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 23:53:41.735116 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732614 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 23:53:41.735116 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732617 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 23:53:41.735116 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732620 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 23:53:41.735116 
ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732622 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 23:53:41.735116 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732624 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 23:53:41.735116 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732627 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 23:53:41.735116 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732629 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 23:53:41.735116 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732632 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 23:53:41.735116 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732635 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 23:53:41.735116 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732637 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 23:53:41.735116 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732641 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 24 23:53:41.735116 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732644 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 23:53:41.735116 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732647 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 23:53:41.735615 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732650 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 23:53:41.735615 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732653 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 23:53:41.735615 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732656 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 23:53:41.735615 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732659 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 23:53:41.735615 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732662 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 23:53:41.735615 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732664 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 23:53:41.735615 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732667 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 23:53:41.735615 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732669 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 23:53:41.735615 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732672 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 23:53:41.735615 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732674 2569 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 23:53:41.735615 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732677 2569 
feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 23:53:41.735615 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732679 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 23:53:41.735615 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732682 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 23:53:41.735615 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732684 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 23:53:41.735615 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732688 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 24 23:53:41.735615 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732692 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 23:53:41.735615 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732694 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 23:53:41.735615 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732697 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 23:53:41.735615 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732699 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 23:53:41.736087 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732701 2569 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 23:53:41.736087 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732704 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 23:53:41.736087 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732707 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 23:53:41.736087 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732710 2569 feature_gate.go:328] unrecognized feature gate: 
InsightsConfigAPI Apr 24 23:53:41.736087 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732712 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 23:53:41.736087 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732715 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 23:53:41.736087 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732717 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 23:53:41.736087 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732720 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 23:53:41.736087 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732725 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 23:53:41.736087 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732727 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 23:53:41.736087 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732730 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 23:53:41.736087 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732733 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 23:53:41.736087 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732735 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 23:53:41.736087 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732737 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 23:53:41.736087 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:41.732740 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 23:53:41.736087 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.732744 2569 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true 
MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 24 23:53:41.736493 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.733421 2569 server.go:962] "Client rotation is on, will bootstrap in background" Apr 24 23:53:41.738287 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.738274 2569 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 24 23:53:41.739304 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.739293 2569 server.go:1019] "Starting client certificate rotation" Apr 24 23:53:41.739418 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.739387 2569 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 24 23:53:41.739449 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.739438 2569 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 24 23:53:41.771725 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.771705 2569 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 24 23:53:41.775117 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.775094 2569 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 24 23:53:41.791909 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.791883 2569 log.go:25] "Validated CRI v1 runtime API" Apr 24 23:53:41.797910 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.797892 2569 log.go:25] "Validated CRI v1 image API" Apr 24 23:53:41.798851 ip-10-0-133-214 
kubenswrapper[2569]: I0424 23:53:41.798834 2569 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 24 23:53:41.799141 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.799124 2569 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 24 23:53:41.804188 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.804167 2569 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 c28c86d5-429b-4ed1-a1db-ffe59d5be4bf:/dev/nvme0n1p4 f3d782fa-89ce-48ec-b9dd-48afad4cc980:/dev/nvme0n1p3]
Apr 24 23:53:41.804249 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.804187 2569 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 24 23:53:41.810753 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.810546 2569 manager.go:217] Machine: {Timestamp:2026-04-24 23:53:41.807826387 +0000 UTC m=+0.412552995 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3099591 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2b75bc343e1728c795fd41b5522c04 SystemUUID:ec2b75bc-343e-1728-c795-fd41b5522c04 BootID:93ef7272-3808-44f6-86e1-80aef379fb3f Filesystems:[{Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:ef:bc:70:71:9d Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:ef:bc:70:71:9d Speed:0 Mtu:9001} {Name:ovs-system MacAddress:96:9d:ba:ef:fb:75 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 24 23:53:41.810753 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.810748 2569 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 24 23:53:41.810862 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.810821 2569 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 24 23:53:41.811909 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.811885 2569 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 24 23:53:41.812072 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.811914 2569 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-133-214.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 24 23:53:41.812152 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.812087 2569 topology_manager.go:138] "Creating topology manager with none policy"
Apr 24 23:53:41.812152 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.812100 2569 container_manager_linux.go:306] "Creating device plugin manager"
Apr 24 23:53:41.812152 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.812118 2569 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 24 23:53:41.813026 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.813015 2569 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 24 23:53:41.814571 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.814559 2569 state_mem.go:36] "Initialized new in-memory state store"
Apr 24 23:53:41.814701 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.814690 2569 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 24 23:53:41.817122 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.817101 2569 kubelet.go:491] "Attempting to sync node with API server"
Apr 24 23:53:41.817122 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.817127 2569 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 24 23:53:41.817230 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.817142 2569 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 24 23:53:41.817230 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.817156 2569 kubelet.go:397] "Adding apiserver pod source"
Apr 24 23:53:41.817230 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.817168 2569 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 24 23:53:41.818333 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.818320 2569 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 24 23:53:41.818396 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.818343 2569 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 24 23:53:41.818708 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.818691 2569 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-8jnq6"
Apr 24 23:53:41.821845 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.821819 2569 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 24 23:53:41.823682 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.823667 2569 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 24 23:53:41.824682 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.824667 2569 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-8jnq6"
Apr 24 23:53:41.824973 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.824960 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 24 23:53:41.825044 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.824981 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 24 23:53:41.825044 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.824990 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 24 23:53:41.825044 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.825000 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 24 23:53:41.825044 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.825009 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 24 23:53:41.825044 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.825020 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 24 23:53:41.825044 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.825030 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 24 23:53:41.825044 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.825042 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 24 23:53:41.825274 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.825056 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 24 23:53:41.825274 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.825067 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 24 23:53:41.825274 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.825089 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 24 23:53:41.825274 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.825103 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 24 23:53:41.826516 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:53:41.826497 2569 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-133-214.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 24 23:53:41.826556 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:53:41.826498 2569 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 24 23:53:41.826819 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.826807 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 24 23:53:41.826868 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.826823 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 24 23:53:41.830221 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.830206 2569 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 24 23:53:41.830298 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.830248 2569 server.go:1295] "Started kubelet"
Apr 24 23:53:41.830346 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.830327 2569 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 24 23:53:41.830439 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.830376 2569 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 24 23:53:41.830490 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.830457 2569 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 24 23:53:41.831074 ip-10-0-133-214 systemd[1]: Started Kubernetes Kubelet.
Apr 24 23:53:41.831590 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.831553 2569 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 24 23:53:41.832956 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.832938 2569 server.go:317] "Adding debug handlers to kubelet server"
Apr 24 23:53:41.836886 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.836867 2569 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 24 23:53:41.837560 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.837536 2569 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 24 23:53:41.839652 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.839192 2569 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 24 23:53:41.839652 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.839315 2569 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 24 23:53:41.839652 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:53:41.839447 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-214.ec2.internal\" not found"
Apr 24 23:53:41.839652 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.839218 2569 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 24 23:53:41.839869 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.839701 2569 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 24 23:53:41.839869 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.839714 2569 factory.go:55] Registering systemd factory
Apr 24 23:53:41.839869 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.839724 2569 factory.go:223] Registration of the systemd container factory successfully
Apr 24 23:53:41.840154 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.840139 2569 factory.go:153] Registering CRI-O factory
Apr 24 23:53:41.840154 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.840155 2569 factory.go:223] Registration of the crio container factory successfully
Apr 24 23:53:41.840248 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.840178 2569 factory.go:103] Registering Raw factory
Apr 24 23:53:41.840248 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.840192 2569 manager.go:1196] Started watching for new ooms in manager
Apr 24 23:53:41.840344 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.840262 2569 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 23:53:41.841267 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:53:41.841235 2569 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 24 23:53:41.841372 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.841268 2569 manager.go:319] Starting recovery of all containers
Apr 24 23:53:41.841656 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.841643 2569 reconstruct.go:97] "Volume reconstruction finished"
Apr 24 23:53:41.841785 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.841775 2569 reconciler.go:26] "Reconciler: start to sync state"
Apr 24 23:53:41.846071 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.846051 2569 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-133-214.ec2.internal" not found
Apr 24 23:53:41.846376 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:53:41.846356 2569 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-133-214.ec2.internal\" not found" node="ip-10-0-133-214.ec2.internal"
Apr 24 23:53:41.853563 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.853548 2569 manager.go:324] Recovery completed
Apr 24 23:53:41.858003 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.857990 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 23:53:41.860370 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.860356 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-214.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 23:53:41.860456 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.860385 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-214.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 23:53:41.860456 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.860398 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-214.ec2.internal" event="NodeHasSufficientPID"
Apr 24 23:53:41.860891 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.860877 2569 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 24 23:53:41.860891 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.860889 2569 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 24 23:53:41.860973 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.860905 2569 state_mem.go:36] "Initialized new in-memory state store"
Apr 24 23:53:41.862327 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.862314 2569 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-133-214.ec2.internal" not found
Apr 24 23:53:41.863073 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.863063 2569 policy_none.go:49] "None policy: Start"
Apr 24 23:53:41.863116 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.863078 2569 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 24 23:53:41.863116 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.863087 2569 state_mem.go:35] "Initializing new in-memory state store"
Apr 24 23:53:41.896989 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.896976 2569 manager.go:341] "Starting Device Plugin manager"
Apr 24 23:53:41.903846 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:53:41.897010 2569 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 24 23:53:41.903846 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.897019 2569 server.go:85] "Starting device plugin registration server"
Apr 24 23:53:41.903846 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.897236 2569 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 24 23:53:41.903846 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.897248 2569 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 24 23:53:41.903846 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.897323 2569 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 24 23:53:41.903846 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.897420 2569 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 24 23:53:41.903846 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.897432 2569 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 24 23:53:41.903846 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:53:41.898105 2569 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 24 23:53:41.903846 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:53:41.898142 2569 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-133-214.ec2.internal\" not found"
Apr 24 23:53:41.926627 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.926608 2569 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-133-214.ec2.internal" not found
Apr 24 23:53:41.970675 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.970585 2569 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 24 23:53:41.971907 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.971889 2569 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 24 23:53:41.971999 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.971918 2569 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 24 23:53:41.972057 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.972001 2569 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 24 23:53:41.972057 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.972012 2569 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 24 23:53:41.972147 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:53:41.972056 2569 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 24 23:53:41.974502 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.974485 2569 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 23:53:41.998056 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.998038 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 23:53:41.998999 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.998982 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-214.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 23:53:41.999073 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.999012 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-214.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 23:53:41.999073 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.999027 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-214.ec2.internal" event="NodeHasSufficientPID"
Apr 24 23:53:41.999073 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:41.999055 2569 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-133-214.ec2.internal"
Apr 24 23:53:42.006167 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:42.006153 2569 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-133-214.ec2.internal"
Apr 24 23:53:42.006214 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:53:42.006173 2569 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-133-214.ec2.internal\": node \"ip-10-0-133-214.ec2.internal\" not found"
Apr 24 23:53:42.026151 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:53:42.026132 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-214.ec2.internal\" not found"
Apr 24 23:53:42.072722 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:42.072664 2569 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-214.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-133-214.ec2.internal"]
Apr 24 23:53:42.072820 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:42.072770 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 23:53:42.074433 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:42.074395 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-214.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 23:53:42.074498 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:42.074451 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-214.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 23:53:42.074498 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:42.074467 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-214.ec2.internal" event="NodeHasSufficientPID"
Apr 24 23:53:42.075674 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:42.075661 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 23:53:42.075845 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:42.075827 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-214.ec2.internal"
Apr 24 23:53:42.075891 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:42.075860 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 23:53:42.076435 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:42.076419 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-214.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 23:53:42.076435 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:42.076432 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-214.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 23:53:42.076530 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:42.076446 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-214.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 23:53:42.076530 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:42.076450 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-214.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 23:53:42.076530 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:42.076457 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-214.ec2.internal" event="NodeHasSufficientPID"
Apr 24 23:53:42.076530 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:42.076459 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-214.ec2.internal" event="NodeHasSufficientPID"
Apr 24 23:53:42.077561 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:42.077549 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-214.ec2.internal"
Apr 24 23:53:42.077613 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:42.077572 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 23:53:42.078536 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:42.078463 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-214.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 23:53:42.078536 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:42.078510 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-214.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 23:53:42.078536 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:42.078527 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-214.ec2.internal" event="NodeHasSufficientPID"
Apr 24 23:53:42.093995 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:53:42.093973 2569 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-133-214.ec2.internal\" not found" node="ip-10-0-133-214.ec2.internal"
Apr 24 23:53:42.097238 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:53:42.097221 2569 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-133-214.ec2.internal\" not found" node="ip-10-0-133-214.ec2.internal"
Apr 24 23:53:42.126534 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:53:42.126513 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-214.ec2.internal\" not found"
Apr 24 23:53:42.143762 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:42.143741 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/20c8b549d04c6a6f4001cebf1b8f071a-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-214.ec2.internal\" (UID: \"20c8b549d04c6a6f4001cebf1b8f071a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-214.ec2.internal"
Apr 24 23:53:42.143829 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:42.143770 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/20c8b549d04c6a6f4001cebf1b8f071a-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-214.ec2.internal\" (UID: \"20c8b549d04c6a6f4001cebf1b8f071a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-214.ec2.internal"
Apr 24 23:53:42.143829 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:42.143788 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/413e1dcc7a9acaef383b6c159ccd3bed-config\") pod \"kube-apiserver-proxy-ip-10-0-133-214.ec2.internal\" (UID: \"413e1dcc7a9acaef383b6c159ccd3bed\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-214.ec2.internal"
Apr 24 23:53:42.226965 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:53:42.226896 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-214.ec2.internal\" not found"
Apr 24 23:53:42.244375 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:42.244353 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/20c8b549d04c6a6f4001cebf1b8f071a-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-214.ec2.internal\" (UID: \"20c8b549d04c6a6f4001cebf1b8f071a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-214.ec2.internal"
Apr 24 23:53:42.244472 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:42.244375 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/20c8b549d04c6a6f4001cebf1b8f071a-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-214.ec2.internal\" (UID: \"20c8b549d04c6a6f4001cebf1b8f071a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-214.ec2.internal"
Apr 24 23:53:42.244472 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:42.244396 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/20c8b549d04c6a6f4001cebf1b8f071a-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-214.ec2.internal\" (UID: \"20c8b549d04c6a6f4001cebf1b8f071a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-214.ec2.internal"
Apr 24 23:53:42.244472 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:42.244442 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/413e1dcc7a9acaef383b6c159ccd3bed-config\") pod \"kube-apiserver-proxy-ip-10-0-133-214.ec2.internal\" (UID: \"413e1dcc7a9acaef383b6c159ccd3bed\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-214.ec2.internal"
Apr 24 23:53:42.244472 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:42.244469 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/413e1dcc7a9acaef383b6c159ccd3bed-config\") pod \"kube-apiserver-proxy-ip-10-0-133-214.ec2.internal\" (UID: \"413e1dcc7a9acaef383b6c159ccd3bed\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-214.ec2.internal"
Apr 24 23:53:42.244602 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:42.244514 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/20c8b549d04c6a6f4001cebf1b8f071a-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-214.ec2.internal\" (UID: \"20c8b549d04c6a6f4001cebf1b8f071a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-214.ec2.internal"
Apr 24 23:53:42.327805 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:53:42.327770 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-214.ec2.internal\" not found"
Apr 24 23:53:42.396395 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:42.396358 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-214.ec2.internal"
Apr 24 23:53:42.400027 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:42.400005 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-214.ec2.internal"
Apr 24 23:53:42.428800 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:53:42.428771 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-214.ec2.internal\" not found"
Apr 24 23:53:42.529432 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:53:42.529333 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-214.ec2.internal\" not found"
Apr 24 23:53:42.629885 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:53:42.629860 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-214.ec2.internal\" not found"
Apr 24 23:53:42.730563 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:53:42.730535 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-214.ec2.internal\" not found"
Apr 24 23:53:42.738705 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:42.738685 2569 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 24 23:53:42.738839 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:42.738823 2569 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 24 23:53:42.738901 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:42.738862 2569 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 24 23:53:42.826562 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:42.826526 2569 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-23 23:48:41 +0000 UTC" deadline="2027-11-07 17:57:47.06024067 +0000 UTC"
Apr 24 23:53:42.826562 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:42.826558 2569 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13482h4m4.233684708s"
Apr 24 23:53:42.830651 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:53:42.830625 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-214.ec2.internal\" not found"
Apr 24 23:53:42.837785 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:42.837762 2569 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 24 23:53:42.858674 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:42.858650 2569 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 24 23:53:42.866355 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:42.866338 2569 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 23:53:42.880322 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:42.880301 2569 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-zn45k"
Apr 24 23:53:42.888221 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:42.888206 2569 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-zn45k"
Apr 24 23:53:42.932443 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:42.932389 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20c8b549d04c6a6f4001cebf1b8f071a.slice/crio-c3c37ef21c1237e7b21163fd20ed290cab28cb93ae715da8b7b9df5c53cbed0d WatchSource:0}: Error finding container c3c37ef21c1237e7b21163fd20ed290cab28cb93ae715da8b7b9df5c53cbed0d: Status 404 returned error can't find the container with id c3c37ef21c1237e7b21163fd20ed290cab28cb93ae715da8b7b9df5c53cbed0d
Apr 24 23:53:42.933935 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:42.933914 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod413e1dcc7a9acaef383b6c159ccd3bed.slice/crio-1b994411690330ec1ce7b01d2050b9bc21396b7e2ce8cd450247fd94aa871ed7 WatchSource:0}: Error finding container 1b994411690330ec1ce7b01d2050b9bc21396b7e2ce8cd450247fd94aa871ed7: Status 404 returned error can't find the container with id 1b994411690330ec1ce7b01d2050b9bc21396b7e2ce8cd450247fd94aa871ed7
Apr 24 23:53:42.937664 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:42.937645 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 23:53:42.937748 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:42.937733 2569 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-214.ec2.internal"
Apr 24 23:53:42.948209 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:42.948192 2569 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 24 23:53:42.950548 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:42.950533 2569 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-214.ec2.internal"
Apr 24 23:53:42.963032 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:42.963012 2569 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 24 23:53:42.974905 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:42.974860 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-214.ec2.internal" event={"ID":"413e1dcc7a9acaef383b6c159ccd3bed","Type":"ContainerStarted","Data":"1b994411690330ec1ce7b01d2050b9bc21396b7e2ce8cd450247fd94aa871ed7"}
Apr 24 23:53:42.975714 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:42.975697 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-214.ec2.internal" event={"ID":"20c8b549d04c6a6f4001cebf1b8f071a","Type":"ContainerStarted","Data":"c3c37ef21c1237e7b21163fd20ed290cab28cb93ae715da8b7b9df5c53cbed0d"}
Apr 24 23:53:43.162558 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.162384 2569 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 23:53:43.585648 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.585566 2569 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 23:53:43.659273 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.659247 2569 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 23:53:43.818623
ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.818590 2569 apiserver.go:52] "Watching apiserver" Apr 24 23:53:43.827491 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.827471 2569 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 24 23:53:43.829900 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.829875 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/konnectivity-agent-2r7xx","openshift-cluster-node-tuning-operator/tuned-4lz5c","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-214.ec2.internal","openshift-multus/multus-additional-cni-plugins-gv8rj","openshift-multus/multus-pk5cg","openshift-multus/network-metrics-daemon-npwg7","openshift-network-diagnostics/network-check-target-lsbf2","openshift-network-operator/iptables-alerter-5s57p","kube-system/kube-apiserver-proxy-ip-10-0-133-214.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dzjgx","openshift-image-registry/node-ca-rsd5n","openshift-ovn-kubernetes/ovnkube-node-xnqmd"] Apr 24 23:53:43.834866 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.834845 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-2r7xx" Apr 24 23:53:43.834980 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.834932 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-4lz5c" Apr 24 23:53:43.837172 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.837055 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-gv8rj" Apr 24 23:53:43.837575 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.837392 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-2b6p8\"" Apr 24 23:53:43.837575 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.837473 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 24 23:53:43.837575 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.837525 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-dzpsh\"" Apr 24 23:53:43.838075 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.838048 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 24 23:53:43.840180 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.838740 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 24 23:53:43.840180 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.838959 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 24 23:53:43.840180 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.839093 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-f7ts4\"" Apr 24 23:53:43.840180 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.839236 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 24 23:53:43.840180 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.839970 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 24 23:53:43.841413 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.841138 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 24 23:53:43.841413 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.841154 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 24 23:53:43.841575 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.841371 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 24 23:53:43.842307 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.841949 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-pk5cg" Apr 24 23:53:43.842307 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.842097 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-npwg7" Apr 24 23:53:43.842307 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:53:43.842199 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-npwg7" podUID="37e765cb-b1c9-4330-ac47-4918ba2ebf0a" Apr 24 23:53:43.844424 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.844007 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-56vnv\"" Apr 24 23:53:43.844424 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.844007 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 24 23:53:43.844424 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.844343 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lsbf2" Apr 24 23:53:43.844589 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:53:43.844448 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lsbf2" podUID="44764fbc-9742-4d96-ae9e-d45956e60888" Apr 24 23:53:43.846665 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.846649 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-5s57p" Apr 24 23:53:43.848725 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.848702 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 24 23:53:43.848821 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.848788 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-g77b2\"" Apr 24 23:53:43.849065 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.849046 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 24 23:53:43.849146 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.849112 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 24 23:53:43.849429 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.849414 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dzjgx" Apr 24 23:53:43.852104 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.851273 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 24 23:53:43.852104 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.851357 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-k2d9m\"" Apr 24 23:53:43.852104 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.851364 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 24 23:53:43.852104 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.851519 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 24 23:53:43.853486 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.853467 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/ddb0ca40-9985-4e85-85cc-14ecd836b337-etc-sysctl-conf\") pod \"tuned-4lz5c\" (UID: \"ddb0ca40-9985-4e85-85cc-14ecd836b337\") " pod="openshift-cluster-node-tuning-operator/tuned-4lz5c" Apr 24 23:53:43.853582 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.853498 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ddb0ca40-9985-4e85-85cc-14ecd836b337-run\") pod \"tuned-4lz5c\" (UID: \"ddb0ca40-9985-4e85-85cc-14ecd836b337\") " pod="openshift-cluster-node-tuning-operator/tuned-4lz5c" Apr 24 23:53:43.853582 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.853521 2569 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/04a809d7-5a4b-4d1b-b069-41b0cd06e320-hostroot\") pod \"multus-pk5cg\" (UID: \"04a809d7-5a4b-4d1b-b069-41b0cd06e320\") " pod="openshift-multus/multus-pk5cg" Apr 24 23:53:43.853582 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.853541 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/37e765cb-b1c9-4330-ac47-4918ba2ebf0a-metrics-certs\") pod \"network-metrics-daemon-npwg7\" (UID: \"37e765cb-b1c9-4330-ac47-4918ba2ebf0a\") " pod="openshift-multus/network-metrics-daemon-npwg7" Apr 24 23:53:43.853582 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.853564 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ddb0ca40-9985-4e85-85cc-14ecd836b337-etc-tuned\") pod \"tuned-4lz5c\" (UID: \"ddb0ca40-9985-4e85-85cc-14ecd836b337\") " pod="openshift-cluster-node-tuning-operator/tuned-4lz5c" Apr 24 23:53:43.853780 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.853605 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/cfab2e9e-eb84-4b70-bc59-197bc3f27fb6-tuning-conf-dir\") pod \"multus-additional-cni-plugins-gv8rj\" (UID: \"cfab2e9e-eb84-4b70-bc59-197bc3f27fb6\") " pod="openshift-multus/multus-additional-cni-plugins-gv8rj" Apr 24 23:53:43.853780 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.853637 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/04a809d7-5a4b-4d1b-b069-41b0cd06e320-cni-binary-copy\") pod \"multus-pk5cg\" (UID: \"04a809d7-5a4b-4d1b-b069-41b0cd06e320\") " pod="openshift-multus/multus-pk5cg" Apr 24 23:53:43.853780 
ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.853663 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/04a809d7-5a4b-4d1b-b069-41b0cd06e320-host-run-netns\") pod \"multus-pk5cg\" (UID: \"04a809d7-5a4b-4d1b-b069-41b0cd06e320\") " pod="openshift-multus/multus-pk5cg" Apr 24 23:53:43.853780 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.853686 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/04a809d7-5a4b-4d1b-b069-41b0cd06e320-system-cni-dir\") pod \"multus-pk5cg\" (UID: \"04a809d7-5a4b-4d1b-b069-41b0cd06e320\") " pod="openshift-multus/multus-pk5cg" Apr 24 23:53:43.853780 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.853709 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ddb0ca40-9985-4e85-85cc-14ecd836b337-lib-modules\") pod \"tuned-4lz5c\" (UID: \"ddb0ca40-9985-4e85-85cc-14ecd836b337\") " pod="openshift-cluster-node-tuning-operator/tuned-4lz5c" Apr 24 23:53:43.853780 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.853730 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ddb0ca40-9985-4e85-85cc-14ecd836b337-host\") pod \"tuned-4lz5c\" (UID: \"ddb0ca40-9985-4e85-85cc-14ecd836b337\") " pod="openshift-cluster-node-tuning-operator/tuned-4lz5c" Apr 24 23:53:43.853780 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.853754 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cfab2e9e-eb84-4b70-bc59-197bc3f27fb6-system-cni-dir\") pod \"multus-additional-cni-plugins-gv8rj\" (UID: \"cfab2e9e-eb84-4b70-bc59-197bc3f27fb6\") " 
pod="openshift-multus/multus-additional-cni-plugins-gv8rj" Apr 24 23:53:43.853780 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.853778 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/cfab2e9e-eb84-4b70-bc59-197bc3f27fb6-cnibin\") pod \"multus-additional-cni-plugins-gv8rj\" (UID: \"cfab2e9e-eb84-4b70-bc59-197bc3f27fb6\") " pod="openshift-multus/multus-additional-cni-plugins-gv8rj" Apr 24 23:53:43.854147 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.853794 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/04a809d7-5a4b-4d1b-b069-41b0cd06e320-etc-kubernetes\") pod \"multus-pk5cg\" (UID: \"04a809d7-5a4b-4d1b-b069-41b0cd06e320\") " pod="openshift-multus/multus-pk5cg" Apr 24 23:53:43.854147 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.853808 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gc44c\" (UniqueName: \"kubernetes.io/projected/04a809d7-5a4b-4d1b-b069-41b0cd06e320-kube-api-access-gc44c\") pod \"multus-pk5cg\" (UID: \"04a809d7-5a4b-4d1b-b069-41b0cd06e320\") " pod="openshift-multus/multus-pk5cg" Apr 24 23:53:43.854147 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.853822 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqr2s\" (UniqueName: \"kubernetes.io/projected/37e765cb-b1c9-4330-ac47-4918ba2ebf0a-kube-api-access-cqr2s\") pod \"network-metrics-daemon-npwg7\" (UID: \"37e765cb-b1c9-4330-ac47-4918ba2ebf0a\") " pod="openshift-multus/network-metrics-daemon-npwg7" Apr 24 23:53:43.854147 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.853872 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: 
\"kubernetes.io/host-path/ddb0ca40-9985-4e85-85cc-14ecd836b337-etc-systemd\") pod \"tuned-4lz5c\" (UID: \"ddb0ca40-9985-4e85-85cc-14ecd836b337\") " pod="openshift-cluster-node-tuning-operator/tuned-4lz5c" Apr 24 23:53:43.854147 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.853897 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/04a809d7-5a4b-4d1b-b069-41b0cd06e320-multus-cni-dir\") pod \"multus-pk5cg\" (UID: \"04a809d7-5a4b-4d1b-b069-41b0cd06e320\") " pod="openshift-multus/multus-pk5cg" Apr 24 23:53:43.854147 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.853914 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/04a809d7-5a4b-4d1b-b069-41b0cd06e320-host-run-k8s-cni-cncf-io\") pod \"multus-pk5cg\" (UID: \"04a809d7-5a4b-4d1b-b069-41b0cd06e320\") " pod="openshift-multus/multus-pk5cg" Apr 24 23:53:43.854147 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.853929 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/04a809d7-5a4b-4d1b-b069-41b0cd06e320-host-var-lib-cni-bin\") pod \"multus-pk5cg\" (UID: \"04a809d7-5a4b-4d1b-b069-41b0cd06e320\") " pod="openshift-multus/multus-pk5cg" Apr 24 23:53:43.854147 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.853942 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/04a809d7-5a4b-4d1b-b069-41b0cd06e320-host-var-lib-cni-multus\") pod \"multus-pk5cg\" (UID: \"04a809d7-5a4b-4d1b-b069-41b0cd06e320\") " pod="openshift-multus/multus-pk5cg" Apr 24 23:53:43.854147 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.853960 2569 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/04a809d7-5a4b-4d1b-b069-41b0cd06e320-host-var-lib-kubelet\") pod \"multus-pk5cg\" (UID: \"04a809d7-5a4b-4d1b-b069-41b0cd06e320\") " pod="openshift-multus/multus-pk5cg" Apr 24 23:53:43.854147 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.854000 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/a9b41727-9688-4980-8671-a860f9ccf954-agent-certs\") pod \"konnectivity-agent-2r7xx\" (UID: \"a9b41727-9688-4980-8671-a860f9ccf954\") " pod="kube-system/konnectivity-agent-2r7xx" Apr 24 23:53:43.854147 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.854014 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/cfab2e9e-eb84-4b70-bc59-197bc3f27fb6-os-release\") pod \"multus-additional-cni-plugins-gv8rj\" (UID: \"cfab2e9e-eb84-4b70-bc59-197bc3f27fb6\") " pod="openshift-multus/multus-additional-cni-plugins-gv8rj" Apr 24 23:53:43.854147 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.854039 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/04a809d7-5a4b-4d1b-b069-41b0cd06e320-multus-conf-dir\") pod \"multus-pk5cg\" (UID: \"04a809d7-5a4b-4d1b-b069-41b0cd06e320\") " pod="openshift-multus/multus-pk5cg" Apr 24 23:53:43.854147 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.854075 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/04a809d7-5a4b-4d1b-b069-41b0cd06e320-host-run-multus-certs\") pod \"multus-pk5cg\" (UID: \"04a809d7-5a4b-4d1b-b069-41b0cd06e320\") " pod="openshift-multus/multus-pk5cg" Apr 24 23:53:43.854147 
ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.854099 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ddb0ca40-9985-4e85-85cc-14ecd836b337-tmp\") pod \"tuned-4lz5c\" (UID: \"ddb0ca40-9985-4e85-85cc-14ecd836b337\") " pod="openshift-cluster-node-tuning-operator/tuned-4lz5c" Apr 24 23:53:43.854147 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.854103 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-rsd5n" Apr 24 23:53:43.854147 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.854124 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/04a809d7-5a4b-4d1b-b069-41b0cd06e320-multus-socket-dir-parent\") pod \"multus-pk5cg\" (UID: \"04a809d7-5a4b-4d1b-b069-41b0cd06e320\") " pod="openshift-multus/multus-pk5cg" Apr 24 23:53:43.854147 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.854151 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/cfab2e9e-eb84-4b70-bc59-197bc3f27fb6-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-gv8rj\" (UID: \"cfab2e9e-eb84-4b70-bc59-197bc3f27fb6\") " pod="openshift-multus/multus-additional-cni-plugins-gv8rj" Apr 24 23:53:43.854866 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.854169 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/ddb0ca40-9985-4e85-85cc-14ecd836b337-etc-sysconfig\") pod \"tuned-4lz5c\" (UID: \"ddb0ca40-9985-4e85-85cc-14ecd836b337\") " pod="openshift-cluster-node-tuning-operator/tuned-4lz5c" Apr 24 23:53:43.854866 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.854185 2569 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ddb0ca40-9985-4e85-85cc-14ecd836b337-etc-sysctl-d\") pod \"tuned-4lz5c\" (UID: \"ddb0ca40-9985-4e85-85cc-14ecd836b337\") " pod="openshift-cluster-node-tuning-operator/tuned-4lz5c" Apr 24 23:53:43.854866 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.854205 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ddb0ca40-9985-4e85-85cc-14ecd836b337-var-lib-kubelet\") pod \"tuned-4lz5c\" (UID: \"ddb0ca40-9985-4e85-85cc-14ecd836b337\") " pod="openshift-cluster-node-tuning-operator/tuned-4lz5c" Apr 24 23:53:43.854866 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.854244 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tlmn\" (UniqueName: \"kubernetes.io/projected/ddb0ca40-9985-4e85-85cc-14ecd836b337-kube-api-access-9tlmn\") pod \"tuned-4lz5c\" (UID: \"ddb0ca40-9985-4e85-85cc-14ecd836b337\") " pod="openshift-cluster-node-tuning-operator/tuned-4lz5c" Apr 24 23:53:43.854866 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.854277 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/cfab2e9e-eb84-4b70-bc59-197bc3f27fb6-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-gv8rj\" (UID: \"cfab2e9e-eb84-4b70-bc59-197bc3f27fb6\") " pod="openshift-multus/multus-additional-cni-plugins-gv8rj" Apr 24 23:53:43.854866 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.854319 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/04a809d7-5a4b-4d1b-b069-41b0cd06e320-os-release\") pod \"multus-pk5cg\" (UID: 
\"04a809d7-5a4b-4d1b-b069-41b0cd06e320\") " pod="openshift-multus/multus-pk5cg" Apr 24 23:53:43.854866 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.854349 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5x7b9\" (UniqueName: \"kubernetes.io/projected/cfab2e9e-eb84-4b70-bc59-197bc3f27fb6-kube-api-access-5x7b9\") pod \"multus-additional-cni-plugins-gv8rj\" (UID: \"cfab2e9e-eb84-4b70-bc59-197bc3f27fb6\") " pod="openshift-multus/multus-additional-cni-plugins-gv8rj" Apr 24 23:53:43.854866 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.854416 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-xnqmd" Apr 24 23:53:43.854866 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.854456 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/a9b41727-9688-4980-8671-a860f9ccf954-konnectivity-ca\") pod \"konnectivity-agent-2r7xx\" (UID: \"a9b41727-9688-4980-8671-a860f9ccf954\") " pod="kube-system/konnectivity-agent-2r7xx" Apr 24 23:53:43.854866 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.854495 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ddb0ca40-9985-4e85-85cc-14ecd836b337-etc-kubernetes\") pod \"tuned-4lz5c\" (UID: \"ddb0ca40-9985-4e85-85cc-14ecd836b337\") " pod="openshift-cluster-node-tuning-operator/tuned-4lz5c" Apr 24 23:53:43.854866 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.854516 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ddb0ca40-9985-4e85-85cc-14ecd836b337-sys\") pod \"tuned-4lz5c\" (UID: \"ddb0ca40-9985-4e85-85cc-14ecd836b337\") " pod="openshift-cluster-node-tuning-operator/tuned-4lz5c" Apr 
24 23:53:43.854866 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.854547 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/cfab2e9e-eb84-4b70-bc59-197bc3f27fb6-cni-binary-copy\") pod \"multus-additional-cni-plugins-gv8rj\" (UID: \"cfab2e9e-eb84-4b70-bc59-197bc3f27fb6\") " pod="openshift-multus/multus-additional-cni-plugins-gv8rj"
Apr 24 23:53:43.854866 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.854571 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/04a809d7-5a4b-4d1b-b069-41b0cd06e320-cnibin\") pod \"multus-pk5cg\" (UID: \"04a809d7-5a4b-4d1b-b069-41b0cd06e320\") " pod="openshift-multus/multus-pk5cg"
Apr 24 23:53:43.854866 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.854594 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/04a809d7-5a4b-4d1b-b069-41b0cd06e320-multus-daemon-config\") pod \"multus-pk5cg\" (UID: \"04a809d7-5a4b-4d1b-b069-41b0cd06e320\") " pod="openshift-multus/multus-pk5cg"
Apr 24 23:53:43.854866 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.854624 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/ddb0ca40-9985-4e85-85cc-14ecd836b337-etc-modprobe-d\") pod \"tuned-4lz5c\" (UID: \"ddb0ca40-9985-4e85-85cc-14ecd836b337\") " pod="openshift-cluster-node-tuning-operator/tuned-4lz5c"
Apr 24 23:53:43.856561 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.856368 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 24 23:53:43.856561 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.856398 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 24 23:53:43.856561 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.856449 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 24 23:53:43.856561 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.856473 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 24 23:53:43.856561 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.856428 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-w8c42\""
Apr 24 23:53:43.857349 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.857070 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-r5w2g\""
Apr 24 23:53:43.857349 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.857120 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 24 23:53:43.857349 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.857122 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 24 23:53:43.857349 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.857163 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 24 23:53:43.857349 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.857296 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 24 23:53:43.857738 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.857578 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 24 23:53:43.889191 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.889155 2569 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 23:48:42 +0000 UTC" deadline="2028-01-12 12:57:55.766186062 +0000 UTC"
Apr 24 23:53:43.889191 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.889182 2569 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15061h4m11.877008269s"
Apr 24 23:53:43.941286 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.941252 2569 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 24 23:53:43.954822 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.954798 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/a9b41727-9688-4980-8671-a860f9ccf954-konnectivity-ca\") pod \"konnectivity-agent-2r7xx\" (UID: \"a9b41727-9688-4980-8671-a860f9ccf954\") " pod="kube-system/konnectivity-agent-2r7xx"
Apr 24 23:53:43.954960 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.954826 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ddb0ca40-9985-4e85-85cc-14ecd836b337-sys\") pod \"tuned-4lz5c\" (UID: \"ddb0ca40-9985-4e85-85cc-14ecd836b337\") " pod="openshift-cluster-node-tuning-operator/tuned-4lz5c"
Apr 24 23:53:43.954960 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.954844 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/cfab2e9e-eb84-4b70-bc59-197bc3f27fb6-cni-binary-copy\") pod \"multus-additional-cni-plugins-gv8rj\" (UID: \"cfab2e9e-eb84-4b70-bc59-197bc3f27fb6\") " pod="openshift-multus/multus-additional-cni-plugins-gv8rj"
Apr 24 23:53:43.954960 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.954863 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/04a809d7-5a4b-4d1b-b069-41b0cd06e320-cnibin\") pod \"multus-pk5cg\" (UID: \"04a809d7-5a4b-4d1b-b069-41b0cd06e320\") " pod="openshift-multus/multus-pk5cg"
Apr 24 23:53:43.954960 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.954882 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0e517da2-c437-430c-aec1-02e2d22665ca-host-cni-bin\") pod \"ovnkube-node-xnqmd\" (UID: \"0e517da2-c437-430c-aec1-02e2d22665ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnqmd"
Apr 24 23:53:43.954960 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.954902 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/ddb0ca40-9985-4e85-85cc-14ecd836b337-etc-sysctl-conf\") pod \"tuned-4lz5c\" (UID: \"ddb0ca40-9985-4e85-85cc-14ecd836b337\") " pod="openshift-cluster-node-tuning-operator/tuned-4lz5c"
Apr 24 23:53:43.954960 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.954924 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/04a809d7-5a4b-4d1b-b069-41b0cd06e320-hostroot\") pod \"multus-pk5cg\" (UID: \"04a809d7-5a4b-4d1b-b069-41b0cd06e320\") " pod="openshift-multus/multus-pk5cg"
Apr 24 23:53:43.954960 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.954933 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ddb0ca40-9985-4e85-85cc-14ecd836b337-sys\") pod \"tuned-4lz5c\" (UID: \"ddb0ca40-9985-4e85-85cc-14ecd836b337\") " pod="openshift-cluster-node-tuning-operator/tuned-4lz5c"
Apr 24 23:53:43.954960 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.954950 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/04a809d7-5a4b-4d1b-b069-41b0cd06e320-cnibin\") pod \"multus-pk5cg\" (UID: \"04a809d7-5a4b-4d1b-b069-41b0cd06e320\") " pod="openshift-multus/multus-pk5cg"
Apr 24 23:53:43.955332 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.954960 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0e517da2-c437-430c-aec1-02e2d22665ca-host-slash\") pod \"ovnkube-node-xnqmd\" (UID: \"0e517da2-c437-430c-aec1-02e2d22665ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnqmd"
Apr 24 23:53:43.955332 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.954965 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/04a809d7-5a4b-4d1b-b069-41b0cd06e320-hostroot\") pod \"multus-pk5cg\" (UID: \"04a809d7-5a4b-4d1b-b069-41b0cd06e320\") " pod="openshift-multus/multus-pk5cg"
Apr 24 23:53:43.955332 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.954997 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/92d3f68f-64da-4914-9d0e-66109d7ac351-iptables-alerter-script\") pod \"iptables-alerter-5s57p\" (UID: \"92d3f68f-64da-4914-9d0e-66109d7ac351\") " pod="openshift-network-operator/iptables-alerter-5s57p"
Apr 24 23:53:43.955332 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.955023 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnwrs\" (UniqueName: \"kubernetes.io/projected/92d3f68f-64da-4914-9d0e-66109d7ac351-kube-api-access-nnwrs\") pod \"iptables-alerter-5s57p\" (UID: \"92d3f68f-64da-4914-9d0e-66109d7ac351\") " pod="openshift-network-operator/iptables-alerter-5s57p"
Apr 24 23:53:43.955332 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.955041 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ddb0ca40-9985-4e85-85cc-14ecd836b337-etc-tuned\") pod \"tuned-4lz5c\" (UID: \"ddb0ca40-9985-4e85-85cc-14ecd836b337\") " pod="openshift-cluster-node-tuning-operator/tuned-4lz5c"
Apr 24 23:53:43.955332 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.955067 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/dd9e66f5-07c5-45ce-92ad-16649e745d96-serviceca\") pod \"node-ca-rsd5n\" (UID: \"dd9e66f5-07c5-45ce-92ad-16649e745d96\") " pod="openshift-image-registry/node-ca-rsd5n"
Apr 24 23:53:43.955332 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.955066 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/ddb0ca40-9985-4e85-85cc-14ecd836b337-etc-sysctl-conf\") pod \"tuned-4lz5c\" (UID: \"ddb0ca40-9985-4e85-85cc-14ecd836b337\") " pod="openshift-cluster-node-tuning-operator/tuned-4lz5c"
Apr 24 23:53:43.955332 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.955091 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/04a809d7-5a4b-4d1b-b069-41b0cd06e320-system-cni-dir\") pod \"multus-pk5cg\" (UID: \"04a809d7-5a4b-4d1b-b069-41b0cd06e320\") " pod="openshift-multus/multus-pk5cg"
Apr 24 23:53:43.955332 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.955111 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/cfab2e9e-eb84-4b70-bc59-197bc3f27fb6-cnibin\") pod \"multus-additional-cni-plugins-gv8rj\" (UID: \"cfab2e9e-eb84-4b70-bc59-197bc3f27fb6\") " pod="openshift-multus/multus-additional-cni-plugins-gv8rj"
Apr 24 23:53:43.955332 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.955134 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gc44c\" (UniqueName: \"kubernetes.io/projected/04a809d7-5a4b-4d1b-b069-41b0cd06e320-kube-api-access-gc44c\") pod \"multus-pk5cg\" (UID: \"04a809d7-5a4b-4d1b-b069-41b0cd06e320\") " pod="openshift-multus/multus-pk5cg"
Apr 24 23:53:43.955332 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.955159 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cqr2s\" (UniqueName: \"kubernetes.io/projected/37e765cb-b1c9-4330-ac47-4918ba2ebf0a-kube-api-access-cqr2s\") pod \"network-metrics-daemon-npwg7\" (UID: \"37e765cb-b1c9-4330-ac47-4918ba2ebf0a\") " pod="openshift-multus/network-metrics-daemon-npwg7"
Apr 24 23:53:43.955332 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.955168 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/04a809d7-5a4b-4d1b-b069-41b0cd06e320-system-cni-dir\") pod \"multus-pk5cg\" (UID: \"04a809d7-5a4b-4d1b-b069-41b0cd06e320\") " pod="openshift-multus/multus-pk5cg"
Apr 24 23:53:43.955332 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.955171 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/cfab2e9e-eb84-4b70-bc59-197bc3f27fb6-cnibin\") pod \"multus-additional-cni-plugins-gv8rj\" (UID: \"cfab2e9e-eb84-4b70-bc59-197bc3f27fb6\") " pod="openshift-multus/multus-additional-cni-plugins-gv8rj"
Apr 24 23:53:43.955332 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.955187 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0e517da2-c437-430c-aec1-02e2d22665ca-systemd-units\") pod \"ovnkube-node-xnqmd\" (UID: \"0e517da2-c437-430c-aec1-02e2d22665ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnqmd"
Apr 24 23:53:43.955332 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.955216 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0e517da2-c437-430c-aec1-02e2d22665ca-host-run-netns\") pod \"ovnkube-node-xnqmd\" (UID: \"0e517da2-c437-430c-aec1-02e2d22665ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnqmd"
Apr 24 23:53:43.955332 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.955231 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0e517da2-c437-430c-aec1-02e2d22665ca-log-socket\") pod \"ovnkube-node-xnqmd\" (UID: \"0e517da2-c437-430c-aec1-02e2d22665ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnqmd"
Apr 24 23:53:43.955332 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.955246 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0e517da2-c437-430c-aec1-02e2d22665ca-env-overrides\") pod \"ovnkube-node-xnqmd\" (UID: \"0e517da2-c437-430c-aec1-02e2d22665ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnqmd"
Apr 24 23:53:43.956027 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.955273 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/04a809d7-5a4b-4d1b-b069-41b0cd06e320-multus-cni-dir\") pod \"multus-pk5cg\" (UID: \"04a809d7-5a4b-4d1b-b069-41b0cd06e320\") " pod="openshift-multus/multus-pk5cg"
Apr 24 23:53:43.956027 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.955304 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/04a809d7-5a4b-4d1b-b069-41b0cd06e320-host-var-lib-cni-multus\") pod \"multus-pk5cg\" (UID: \"04a809d7-5a4b-4d1b-b069-41b0cd06e320\") " pod="openshift-multus/multus-pk5cg"
Apr 24 23:53:43.956027 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.955359 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/04a809d7-5a4b-4d1b-b069-41b0cd06e320-multus-cni-dir\") pod \"multus-pk5cg\" (UID: \"04a809d7-5a4b-4d1b-b069-41b0cd06e320\") " pod="openshift-multus/multus-pk5cg"
Apr 24 23:53:43.956027 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.955350 2569 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 24 23:53:43.956027 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.955381 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0e517da2-c437-430c-aec1-02e2d22665ca-ovnkube-config\") pod \"ovnkube-node-xnqmd\" (UID: \"0e517da2-c437-430c-aec1-02e2d22665ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnqmd"
Apr 24 23:53:43.956027 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.955438 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/a9b41727-9688-4980-8671-a860f9ccf954-konnectivity-ca\") pod \"konnectivity-agent-2r7xx\" (UID: \"a9b41727-9688-4980-8671-a860f9ccf954\") " pod="kube-system/konnectivity-agent-2r7xx"
Apr 24 23:53:43.956027 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.955496 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/cfab2e9e-eb84-4b70-bc59-197bc3f27fb6-cni-binary-copy\") pod \"multus-additional-cni-plugins-gv8rj\" (UID: \"cfab2e9e-eb84-4b70-bc59-197bc3f27fb6\") " pod="openshift-multus/multus-additional-cni-plugins-gv8rj"
Apr 24 23:53:43.956027 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.955509 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/04a809d7-5a4b-4d1b-b069-41b0cd06e320-host-var-lib-cni-multus\") pod \"multus-pk5cg\" (UID: \"04a809d7-5a4b-4d1b-b069-41b0cd06e320\") " pod="openshift-multus/multus-pk5cg"
Apr 24 23:53:43.956027 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.955569 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/a9b41727-9688-4980-8671-a860f9ccf954-agent-certs\") pod \"konnectivity-agent-2r7xx\" (UID: \"a9b41727-9688-4980-8671-a860f9ccf954\") " pod="kube-system/konnectivity-agent-2r7xx"
Apr 24 23:53:43.956027 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.955613 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/04a809d7-5a4b-4d1b-b069-41b0cd06e320-multus-conf-dir\") pod \"multus-pk5cg\" (UID: \"04a809d7-5a4b-4d1b-b069-41b0cd06e320\") " pod="openshift-multus/multus-pk5cg"
Apr 24 23:53:43.956027 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.955640 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/04a809d7-5a4b-4d1b-b069-41b0cd06e320-host-run-multus-certs\") pod \"multus-pk5cg\" (UID: \"04a809d7-5a4b-4d1b-b069-41b0cd06e320\") " pod="openshift-multus/multus-pk5cg"
Apr 24 23:53:43.956027 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.955686 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/92a57ac1-cd55-47a0-a17d-cf953d38b2ea-socket-dir\") pod \"aws-ebs-csi-driver-node-dzjgx\" (UID: \"92a57ac1-cd55-47a0-a17d-cf953d38b2ea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dzjgx"
Apr 24 23:53:43.956027 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.955695 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/04a809d7-5a4b-4d1b-b069-41b0cd06e320-host-run-multus-certs\") pod \"multus-pk5cg\" (UID: \"04a809d7-5a4b-4d1b-b069-41b0cd06e320\") " pod="openshift-multus/multus-pk5cg"
Apr 24 23:53:43.956027 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.955716 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ddb0ca40-9985-4e85-85cc-14ecd836b337-tmp\") pod \"tuned-4lz5c\" (UID: \"ddb0ca40-9985-4e85-85cc-14ecd836b337\") " pod="openshift-cluster-node-tuning-operator/tuned-4lz5c"
Apr 24 23:53:43.956027 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.955837 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/04a809d7-5a4b-4d1b-b069-41b0cd06e320-multus-conf-dir\") pod \"multus-pk5cg\" (UID: \"04a809d7-5a4b-4d1b-b069-41b0cd06e320\") " pod="openshift-multus/multus-pk5cg"
Apr 24 23:53:43.956027 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.955845 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0e517da2-c437-430c-aec1-02e2d22665ca-etc-openvswitch\") pod \"ovnkube-node-xnqmd\" (UID: \"0e517da2-c437-430c-aec1-02e2d22665ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnqmd"
Apr 24 23:53:43.956027 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.955883 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0e517da2-c437-430c-aec1-02e2d22665ca-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-xnqmd\" (UID: \"0e517da2-c437-430c-aec1-02e2d22665ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnqmd"
Apr 24 23:53:43.956831 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.955915 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/cfab2e9e-eb84-4b70-bc59-197bc3f27fb6-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-gv8rj\" (UID: \"cfab2e9e-eb84-4b70-bc59-197bc3f27fb6\") " pod="openshift-multus/multus-additional-cni-plugins-gv8rj"
Apr 24 23:53:43.956831 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.955942 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/ddb0ca40-9985-4e85-85cc-14ecd836b337-etc-sysconfig\") pod \"tuned-4lz5c\" (UID: \"ddb0ca40-9985-4e85-85cc-14ecd836b337\") " pod="openshift-cluster-node-tuning-operator/tuned-4lz5c"
Apr 24 23:53:43.956831 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.955969 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/cfab2e9e-eb84-4b70-bc59-197bc3f27fb6-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-gv8rj\" (UID: \"cfab2e9e-eb84-4b70-bc59-197bc3f27fb6\") " pod="openshift-multus/multus-additional-cni-plugins-gv8rj"
Apr 24 23:53:43.956831 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.956004 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/ddb0ca40-9985-4e85-85cc-14ecd836b337-etc-sysconfig\") pod \"tuned-4lz5c\" (UID: \"ddb0ca40-9985-4e85-85cc-14ecd836b337\") " pod="openshift-cluster-node-tuning-operator/tuned-4lz5c"
Apr 24 23:53:43.956831 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.956050 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/04a809d7-5a4b-4d1b-b069-41b0cd06e320-os-release\") pod \"multus-pk5cg\" (UID: \"04a809d7-5a4b-4d1b-b069-41b0cd06e320\") " pod="openshift-multus/multus-pk5cg"
Apr 24 23:53:43.956831 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.956081 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0e517da2-c437-430c-aec1-02e2d22665ca-var-lib-openvswitch\") pod \"ovnkube-node-xnqmd\" (UID: \"0e517da2-c437-430c-aec1-02e2d22665ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnqmd"
Apr 24 23:53:43.956831 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.956107 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/92a57ac1-cd55-47a0-a17d-cf953d38b2ea-registration-dir\") pod \"aws-ebs-csi-driver-node-dzjgx\" (UID: \"92a57ac1-cd55-47a0-a17d-cf953d38b2ea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dzjgx"
Apr 24 23:53:43.956831 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.956123 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/04a809d7-5a4b-4d1b-b069-41b0cd06e320-os-release\") pod \"multus-pk5cg\" (UID: \"04a809d7-5a4b-4d1b-b069-41b0cd06e320\") " pod="openshift-multus/multus-pk5cg"
Apr 24 23:53:43.956831 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.956133 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0e517da2-c437-430c-aec1-02e2d22665ca-run-openvswitch\") pod \"ovnkube-node-xnqmd\" (UID: \"0e517da2-c437-430c-aec1-02e2d22665ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnqmd"
Apr 24 23:53:43.956831 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.956161 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5x7b9\" (UniqueName: \"kubernetes.io/projected/cfab2e9e-eb84-4b70-bc59-197bc3f27fb6-kube-api-access-5x7b9\") pod \"multus-additional-cni-plugins-gv8rj\" (UID: \"cfab2e9e-eb84-4b70-bc59-197bc3f27fb6\") " pod="openshift-multus/multus-additional-cni-plugins-gv8rj"
Apr 24 23:53:43.956831 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.956193 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ddb0ca40-9985-4e85-85cc-14ecd836b337-etc-kubernetes\") pod \"tuned-4lz5c\" (UID: \"ddb0ca40-9985-4e85-85cc-14ecd836b337\") " pod="openshift-cluster-node-tuning-operator/tuned-4lz5c"
Apr 24 23:53:43.956831 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.956217 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/04a809d7-5a4b-4d1b-b069-41b0cd06e320-multus-daemon-config\") pod \"multus-pk5cg\" (UID: \"04a809d7-5a4b-4d1b-b069-41b0cd06e320\") " pod="openshift-multus/multus-pk5cg"
Apr 24 23:53:43.956831 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.956259 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/92a57ac1-cd55-47a0-a17d-cf953d38b2ea-sys-fs\") pod \"aws-ebs-csi-driver-node-dzjgx\" (UID: \"92a57ac1-cd55-47a0-a17d-cf953d38b2ea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dzjgx"
Apr 24 23:53:43.956831 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.956293 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ddb0ca40-9985-4e85-85cc-14ecd836b337-etc-kubernetes\") pod \"tuned-4lz5c\" (UID: \"ddb0ca40-9985-4e85-85cc-14ecd836b337\") " pod="openshift-cluster-node-tuning-operator/tuned-4lz5c"
Apr 24 23:53:43.956831 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.956299 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dd9e66f5-07c5-45ce-92ad-16649e745d96-host\") pod \"node-ca-rsd5n\" (UID: \"dd9e66f5-07c5-45ce-92ad-16649e745d96\") " pod="openshift-image-registry/node-ca-rsd5n"
Apr 24 23:53:43.956831 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.956340 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/ddb0ca40-9985-4e85-85cc-14ecd836b337-etc-modprobe-d\") pod \"tuned-4lz5c\" (UID: \"ddb0ca40-9985-4e85-85cc-14ecd836b337\") " pod="openshift-cluster-node-tuning-operator/tuned-4lz5c"
Apr 24 23:53:43.956831 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.956371 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ddb0ca40-9985-4e85-85cc-14ecd836b337-run\") pod \"tuned-4lz5c\" (UID: \"ddb0ca40-9985-4e85-85cc-14ecd836b337\") " pod="openshift-cluster-node-tuning-operator/tuned-4lz5c"
Apr 24 23:53:43.957682 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.956395 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/37e765cb-b1c9-4330-ac47-4918ba2ebf0a-metrics-certs\") pod \"network-metrics-daemon-npwg7\" (UID: \"37e765cb-b1c9-4330-ac47-4918ba2ebf0a\") " pod="openshift-multus/network-metrics-daemon-npwg7"
Apr 24 23:53:43.957682 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.956437 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0e517da2-c437-430c-aec1-02e2d22665ca-ovnkube-script-lib\") pod \"ovnkube-node-xnqmd\" (UID: \"0e517da2-c437-430c-aec1-02e2d22665ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnqmd"
Apr 24 23:53:43.957682 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.956466 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/cfab2e9e-eb84-4b70-bc59-197bc3f27fb6-tuning-conf-dir\") pod \"multus-additional-cni-plugins-gv8rj\" (UID: \"cfab2e9e-eb84-4b70-bc59-197bc3f27fb6\") " pod="openshift-multus/multus-additional-cni-plugins-gv8rj"
Apr 24 23:53:43.957682 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.956492 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/04a809d7-5a4b-4d1b-b069-41b0cd06e320-cni-binary-copy\") pod \"multus-pk5cg\" (UID: \"04a809d7-5a4b-4d1b-b069-41b0cd06e320\") " pod="openshift-multus/multus-pk5cg"
Apr 24 23:53:43.957682 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.956513 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/cfab2e9e-eb84-4b70-bc59-197bc3f27fb6-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-gv8rj\" (UID: \"cfab2e9e-eb84-4b70-bc59-197bc3f27fb6\") " pod="openshift-multus/multus-additional-cni-plugins-gv8rj"
Apr 24 23:53:43.957682 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.956518 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/04a809d7-5a4b-4d1b-b069-41b0cd06e320-host-run-netns\") pod \"multus-pk5cg\" (UID: \"04a809d7-5a4b-4d1b-b069-41b0cd06e320\") " pod="openshift-multus/multus-pk5cg"
Apr 24 23:53:43.957682 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.956572 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0e517da2-c437-430c-aec1-02e2d22665ca-host-cni-netd\") pod \"ovnkube-node-xnqmd\" (UID: \"0e517da2-c437-430c-aec1-02e2d22665ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnqmd"
Apr 24 23:53:43.957682 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.956573 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/04a809d7-5a4b-4d1b-b069-41b0cd06e320-host-run-netns\") pod \"multus-pk5cg\" (UID: \"04a809d7-5a4b-4d1b-b069-41b0cd06e320\") " pod="openshift-multus/multus-pk5cg"
Apr 24 23:53:43.957682 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.956608 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmgwg\" (UniqueName: \"kubernetes.io/projected/44764fbc-9742-4d96-ae9e-d45956e60888-kube-api-access-bmgwg\") pod \"network-check-target-lsbf2\" (UID: \"44764fbc-9742-4d96-ae9e-d45956e60888\") " pod="openshift-network-diagnostics/network-check-target-lsbf2"
Apr 24 23:53:43.957682 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.956639 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ddb0ca40-9985-4e85-85cc-14ecd836b337-lib-modules\") pod \"tuned-4lz5c\" (UID: \"ddb0ca40-9985-4e85-85cc-14ecd836b337\") " pod="openshift-cluster-node-tuning-operator/tuned-4lz5c"
Apr 24 23:53:43.957682 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.956667 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ddb0ca40-9985-4e85-85cc-14ecd836b337-host\") pod \"tuned-4lz5c\" (UID: \"ddb0ca40-9985-4e85-85cc-14ecd836b337\") " pod="openshift-cluster-node-tuning-operator/tuned-4lz5c"
Apr 24 23:53:43.957682 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.956684 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/ddb0ca40-9985-4e85-85cc-14ecd836b337-etc-modprobe-d\") pod \"tuned-4lz5c\" (UID: \"ddb0ca40-9985-4e85-85cc-14ecd836b337\") " pod="openshift-cluster-node-tuning-operator/tuned-4lz5c"
Apr 24 23:53:43.957682 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.956727 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cfab2e9e-eb84-4b70-bc59-197bc3f27fb6-system-cni-dir\") pod \"multus-additional-cni-plugins-gv8rj\" (UID: \"cfab2e9e-eb84-4b70-bc59-197bc3f27fb6\") " pod="openshift-multus/multus-additional-cni-plugins-gv8rj"
Apr 24 23:53:43.957682 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.956733 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ddb0ca40-9985-4e85-85cc-14ecd836b337-run\") pod \"tuned-4lz5c\" (UID: \"ddb0ca40-9985-4e85-85cc-14ecd836b337\") " pod="openshift-cluster-node-tuning-operator/tuned-4lz5c"
Apr 24 23:53:43.957682 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.956757 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/04a809d7-5a4b-4d1b-b069-41b0cd06e320-etc-kubernetes\") pod \"multus-pk5cg\" (UID: \"04a809d7-5a4b-4d1b-b069-41b0cd06e320\") " pod="openshift-multus/multus-pk5cg"
Apr 24 23:53:43.957682 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.956784 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ddb0ca40-9985-4e85-85cc-14ecd836b337-etc-systemd\") pod \"tuned-4lz5c\" (UID: \"ddb0ca40-9985-4e85-85cc-14ecd836b337\") " pod="openshift-cluster-node-tuning-operator/tuned-4lz5c"
Apr 24 23:53:43.957682 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.956811 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/04a809d7-5a4b-4d1b-b069-41b0cd06e320-host-run-k8s-cni-cncf-io\") pod \"multus-pk5cg\" (UID: \"04a809d7-5a4b-4d1b-b069-41b0cd06e320\") " pod="openshift-multus/multus-pk5cg"
Apr 24 23:53:43.957682 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:53:43.956818 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 23:53:43.958271 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.956838 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/04a809d7-5a4b-4d1b-b069-41b0cd06e320-host-var-lib-cni-bin\") pod \"multus-pk5cg\" (UID: \"04a809d7-5a4b-4d1b-b069-41b0cd06e320\") " pod="openshift-multus/multus-pk5cg"
Apr 24 23:53:43.958271 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.956860 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/04a809d7-5a4b-4d1b-b069-41b0cd06e320-multus-daemon-config\") pod \"multus-pk5cg\" (UID: \"04a809d7-5a4b-4d1b-b069-41b0cd06e320\") " pod="openshift-multus/multus-pk5cg"
Apr 24 23:53:43.958271 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.956895 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/04a809d7-5a4b-4d1b-b069-41b0cd06e320-host-var-lib-kubelet\") pod \"multus-pk5cg\" (UID: \"04a809d7-5a4b-4d1b-b069-41b0cd06e320\") " pod="openshift-multus/multus-pk5cg"
Apr 24 23:53:43.958271 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.956936 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/04a809d7-5a4b-4d1b-b069-41b0cd06e320-etc-kubernetes\") pod \"multus-pk5cg\" (UID: \"04a809d7-5a4b-4d1b-b069-41b0cd06e320\") " pod="openshift-multus/multus-pk5cg"
Apr 24 23:53:43.958271 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:53:43.956957 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/37e765cb-b1c9-4330-ac47-4918ba2ebf0a-metrics-certs podName:37e765cb-b1c9-4330-ac47-4918ba2ebf0a nodeName:}" failed. No retries permitted until 2026-04-24 23:53:44.456913202 +0000 UTC m=+3.061639782 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/37e765cb-b1c9-4330-ac47-4918ba2ebf0a-metrics-certs") pod "network-metrics-daemon-npwg7" (UID: "37e765cb-b1c9-4330-ac47-4918ba2ebf0a") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 23:53:43.958271 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.956982 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0e517da2-c437-430c-aec1-02e2d22665ca-run-ovn\") pod \"ovnkube-node-xnqmd\" (UID: \"0e517da2-c437-430c-aec1-02e2d22665ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnqmd"
Apr 24 23:53:43.958271 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.957009 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsgz5\" (UniqueName: \"kubernetes.io/projected/0e517da2-c437-430c-aec1-02e2d22665ca-kube-api-access-nsgz5\") pod \"ovnkube-node-xnqmd\" (UID: \"0e517da2-c437-430c-aec1-02e2d22665ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnqmd"
Apr 24 23:53:43.958271 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.957039 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ddb0ca40-9985-4e85-85cc-14ecd836b337-lib-modules\") pod \"tuned-4lz5c\" (UID: \"ddb0ca40-9985-4e85-85cc-14ecd836b337\") " pod="openshift-cluster-node-tuning-operator/tuned-4lz5c"
Apr 24 23:53:43.958271 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.957082 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ddb0ca40-9985-4e85-85cc-14ecd836b337-etc-systemd\") pod \"tuned-4lz5c\" (UID: \"ddb0ca40-9985-4e85-85cc-14ecd836b337\") " pod="openshift-cluster-node-tuning-operator/tuned-4lz5c"
Apr 24 23:53:43.958271 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.957075 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/04a809d7-5a4b-4d1b-b069-41b0cd06e320-host-var-lib-cni-bin\") pod \"multus-pk5cg\" (UID: \"04a809d7-5a4b-4d1b-b069-41b0cd06e320\") " pod="openshift-multus/multus-pk5cg"
Apr 24 23:53:43.958271 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.957138 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/04a809d7-5a4b-4d1b-b069-41b0cd06e320-host-var-lib-kubelet\") pod \"multus-pk5cg\" (UID: \"04a809d7-5a4b-4d1b-b069-41b0cd06e320\") " pod="openshift-multus/multus-pk5cg"
Apr 24 23:53:43.958271 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.957159 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ddb0ca40-9985-4e85-85cc-14ecd836b337-host\") pod \"tuned-4lz5c\" (UID: \"ddb0ca40-9985-4e85-85cc-14ecd836b337\") " pod="openshift-cluster-node-tuning-operator/tuned-4lz5c"
Apr 24 23:53:43.958271 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.957176 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cfab2e9e-eb84-4b70-bc59-197bc3f27fb6-system-cni-dir\") pod \"multus-additional-cni-plugins-gv8rj\" (UID: \"cfab2e9e-eb84-4b70-bc59-197bc3f27fb6\") " pod="openshift-multus/multus-additional-cni-plugins-gv8rj"
Apr 24 23:53:43.958271 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.957196 2569 operation_generator.go:615]
"MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/cfab2e9e-eb84-4b70-bc59-197bc3f27fb6-tuning-conf-dir\") pod \"multus-additional-cni-plugins-gv8rj\" (UID: \"cfab2e9e-eb84-4b70-bc59-197bc3f27fb6\") " pod="openshift-multus/multus-additional-cni-plugins-gv8rj" Apr 24 23:53:43.958271 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.957226 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t992g\" (UniqueName: \"kubernetes.io/projected/92a57ac1-cd55-47a0-a17d-cf953d38b2ea-kube-api-access-t992g\") pod \"aws-ebs-csi-driver-node-dzjgx\" (UID: \"92a57ac1-cd55-47a0-a17d-cf953d38b2ea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dzjgx" Apr 24 23:53:43.958271 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.957293 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/04a809d7-5a4b-4d1b-b069-41b0cd06e320-host-run-k8s-cni-cncf-io\") pod \"multus-pk5cg\" (UID: \"04a809d7-5a4b-4d1b-b069-41b0cd06e320\") " pod="openshift-multus/multus-pk5cg" Apr 24 23:53:43.958271 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.957295 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/cfab2e9e-eb84-4b70-bc59-197bc3f27fb6-os-release\") pod \"multus-additional-cni-plugins-gv8rj\" (UID: \"cfab2e9e-eb84-4b70-bc59-197bc3f27fb6\") " pod="openshift-multus/multus-additional-cni-plugins-gv8rj" Apr 24 23:53:43.958805 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.957332 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0e517da2-c437-430c-aec1-02e2d22665ca-run-systemd\") pod \"ovnkube-node-xnqmd\" (UID: \"0e517da2-c437-430c-aec1-02e2d22665ca\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-xnqmd" Apr 24 23:53:43.958805 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.957349 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/cfab2e9e-eb84-4b70-bc59-197bc3f27fb6-os-release\") pod \"multus-additional-cni-plugins-gv8rj\" (UID: \"cfab2e9e-eb84-4b70-bc59-197bc3f27fb6\") " pod="openshift-multus/multus-additional-cni-plugins-gv8rj" Apr 24 23:53:43.958805 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.957366 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0e517da2-c437-430c-aec1-02e2d22665ca-host-run-ovn-kubernetes\") pod \"ovnkube-node-xnqmd\" (UID: \"0e517da2-c437-430c-aec1-02e2d22665ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnqmd" Apr 24 23:53:43.958805 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.957392 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/92a57ac1-cd55-47a0-a17d-cf953d38b2ea-kubelet-dir\") pod \"aws-ebs-csi-driver-node-dzjgx\" (UID: \"92a57ac1-cd55-47a0-a17d-cf953d38b2ea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dzjgx" Apr 24 23:53:43.958805 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.957476 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/92a57ac1-cd55-47a0-a17d-cf953d38b2ea-device-dir\") pod \"aws-ebs-csi-driver-node-dzjgx\" (UID: \"92a57ac1-cd55-47a0-a17d-cf953d38b2ea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dzjgx" Apr 24 23:53:43.958805 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.957500 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tn2v\" 
(UniqueName: \"kubernetes.io/projected/dd9e66f5-07c5-45ce-92ad-16649e745d96-kube-api-access-8tn2v\") pod \"node-ca-rsd5n\" (UID: \"dd9e66f5-07c5-45ce-92ad-16649e745d96\") " pod="openshift-image-registry/node-ca-rsd5n" Apr 24 23:53:43.958805 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.957527 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/04a809d7-5a4b-4d1b-b069-41b0cd06e320-multus-socket-dir-parent\") pod \"multus-pk5cg\" (UID: \"04a809d7-5a4b-4d1b-b069-41b0cd06e320\") " pod="openshift-multus/multus-pk5cg" Apr 24 23:53:43.958805 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.957551 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0e517da2-c437-430c-aec1-02e2d22665ca-host-kubelet\") pod \"ovnkube-node-xnqmd\" (UID: \"0e517da2-c437-430c-aec1-02e2d22665ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnqmd" Apr 24 23:53:43.958805 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.957581 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0e517da2-c437-430c-aec1-02e2d22665ca-ovn-node-metrics-cert\") pod \"ovnkube-node-xnqmd\" (UID: \"0e517da2-c437-430c-aec1-02e2d22665ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnqmd" Apr 24 23:53:43.958805 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.957623 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/92d3f68f-64da-4914-9d0e-66109d7ac351-host-slash\") pod \"iptables-alerter-5s57p\" (UID: \"92d3f68f-64da-4914-9d0e-66109d7ac351\") " pod="openshift-network-operator/iptables-alerter-5s57p" Apr 24 23:53:43.958805 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.957647 2569 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/92a57ac1-cd55-47a0-a17d-cf953d38b2ea-etc-selinux\") pod \"aws-ebs-csi-driver-node-dzjgx\" (UID: \"92a57ac1-cd55-47a0-a17d-cf953d38b2ea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dzjgx" Apr 24 23:53:43.958805 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.957666 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/cfab2e9e-eb84-4b70-bc59-197bc3f27fb6-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-gv8rj\" (UID: \"cfab2e9e-eb84-4b70-bc59-197bc3f27fb6\") " pod="openshift-multus/multus-additional-cni-plugins-gv8rj" Apr 24 23:53:43.958805 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.957672 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0e517da2-c437-430c-aec1-02e2d22665ca-node-log\") pod \"ovnkube-node-xnqmd\" (UID: \"0e517da2-c437-430c-aec1-02e2d22665ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnqmd" Apr 24 23:53:43.958805 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.957670 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/04a809d7-5a4b-4d1b-b069-41b0cd06e320-cni-binary-copy\") pod \"multus-pk5cg\" (UID: \"04a809d7-5a4b-4d1b-b069-41b0cd06e320\") " pod="openshift-multus/multus-pk5cg" Apr 24 23:53:43.958805 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.957728 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ddb0ca40-9985-4e85-85cc-14ecd836b337-etc-sysctl-d\") pod \"tuned-4lz5c\" (UID: \"ddb0ca40-9985-4e85-85cc-14ecd836b337\") " pod="openshift-cluster-node-tuning-operator/tuned-4lz5c" Apr 
24 23:53:43.958805 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.957766 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ddb0ca40-9985-4e85-85cc-14ecd836b337-var-lib-kubelet\") pod \"tuned-4lz5c\" (UID: \"ddb0ca40-9985-4e85-85cc-14ecd836b337\") " pod="openshift-cluster-node-tuning-operator/tuned-4lz5c" Apr 24 23:53:43.959458 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.957783 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/04a809d7-5a4b-4d1b-b069-41b0cd06e320-multus-socket-dir-parent\") pod \"multus-pk5cg\" (UID: \"04a809d7-5a4b-4d1b-b069-41b0cd06e320\") " pod="openshift-multus/multus-pk5cg" Apr 24 23:53:43.959458 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.957813 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ddb0ca40-9985-4e85-85cc-14ecd836b337-var-lib-kubelet\") pod \"tuned-4lz5c\" (UID: \"ddb0ca40-9985-4e85-85cc-14ecd836b337\") " pod="openshift-cluster-node-tuning-operator/tuned-4lz5c" Apr 24 23:53:43.959458 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.957823 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ddb0ca40-9985-4e85-85cc-14ecd836b337-etc-sysctl-d\") pod \"tuned-4lz5c\" (UID: \"ddb0ca40-9985-4e85-85cc-14ecd836b337\") " pod="openshift-cluster-node-tuning-operator/tuned-4lz5c" Apr 24 23:53:43.959458 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.957839 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9tlmn\" (UniqueName: \"kubernetes.io/projected/ddb0ca40-9985-4e85-85cc-14ecd836b337-kube-api-access-9tlmn\") pod \"tuned-4lz5c\" (UID: \"ddb0ca40-9985-4e85-85cc-14ecd836b337\") " 
pod="openshift-cluster-node-tuning-operator/tuned-4lz5c" Apr 24 23:53:43.959458 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.958976 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ddb0ca40-9985-4e85-85cc-14ecd836b337-tmp\") pod \"tuned-4lz5c\" (UID: \"ddb0ca40-9985-4e85-85cc-14ecd836b337\") " pod="openshift-cluster-node-tuning-operator/tuned-4lz5c" Apr 24 23:53:43.959458 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.958976 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ddb0ca40-9985-4e85-85cc-14ecd836b337-etc-tuned\") pod \"tuned-4lz5c\" (UID: \"ddb0ca40-9985-4e85-85cc-14ecd836b337\") " pod="openshift-cluster-node-tuning-operator/tuned-4lz5c" Apr 24 23:53:43.959458 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.959027 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/a9b41727-9688-4980-8671-a860f9ccf954-agent-certs\") pod \"konnectivity-agent-2r7xx\" (UID: \"a9b41727-9688-4980-8671-a860f9ccf954\") " pod="kube-system/konnectivity-agent-2r7xx" Apr 24 23:53:43.963605 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.963580 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqr2s\" (UniqueName: \"kubernetes.io/projected/37e765cb-b1c9-4330-ac47-4918ba2ebf0a-kube-api-access-cqr2s\") pod \"network-metrics-daemon-npwg7\" (UID: \"37e765cb-b1c9-4330-ac47-4918ba2ebf0a\") " pod="openshift-multus/network-metrics-daemon-npwg7" Apr 24 23:53:43.963730 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.963674 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gc44c\" (UniqueName: \"kubernetes.io/projected/04a809d7-5a4b-4d1b-b069-41b0cd06e320-kube-api-access-gc44c\") pod \"multus-pk5cg\" (UID: \"04a809d7-5a4b-4d1b-b069-41b0cd06e320\") " 
pod="openshift-multus/multus-pk5cg" Apr 24 23:53:43.963791 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.963770 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5x7b9\" (UniqueName: \"kubernetes.io/projected/cfab2e9e-eb84-4b70-bc59-197bc3f27fb6-kube-api-access-5x7b9\") pod \"multus-additional-cni-plugins-gv8rj\" (UID: \"cfab2e9e-eb84-4b70-bc59-197bc3f27fb6\") " pod="openshift-multus/multus-additional-cni-plugins-gv8rj" Apr 24 23:53:43.964918 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:43.964891 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tlmn\" (UniqueName: \"kubernetes.io/projected/ddb0ca40-9985-4e85-85cc-14ecd836b337-kube-api-access-9tlmn\") pod \"tuned-4lz5c\" (UID: \"ddb0ca40-9985-4e85-85cc-14ecd836b337\") " pod="openshift-cluster-node-tuning-operator/tuned-4lz5c" Apr 24 23:53:44.059038 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:44.059004 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/92a57ac1-cd55-47a0-a17d-cf953d38b2ea-socket-dir\") pod \"aws-ebs-csi-driver-node-dzjgx\" (UID: \"92a57ac1-cd55-47a0-a17d-cf953d38b2ea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dzjgx" Apr 24 23:53:44.059202 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:44.059055 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0e517da2-c437-430c-aec1-02e2d22665ca-etc-openvswitch\") pod \"ovnkube-node-xnqmd\" (UID: \"0e517da2-c437-430c-aec1-02e2d22665ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnqmd" Apr 24 23:53:44.059202 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:44.059083 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/0e517da2-c437-430c-aec1-02e2d22665ca-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-xnqmd\" (UID: \"0e517da2-c437-430c-aec1-02e2d22665ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnqmd" Apr 24 23:53:44.059202 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:44.059116 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0e517da2-c437-430c-aec1-02e2d22665ca-var-lib-openvswitch\") pod \"ovnkube-node-xnqmd\" (UID: \"0e517da2-c437-430c-aec1-02e2d22665ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnqmd" Apr 24 23:53:44.059202 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:44.059141 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/92a57ac1-cd55-47a0-a17d-cf953d38b2ea-registration-dir\") pod \"aws-ebs-csi-driver-node-dzjgx\" (UID: \"92a57ac1-cd55-47a0-a17d-cf953d38b2ea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dzjgx" Apr 24 23:53:44.059202 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:44.059156 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/92a57ac1-cd55-47a0-a17d-cf953d38b2ea-socket-dir\") pod \"aws-ebs-csi-driver-node-dzjgx\" (UID: \"92a57ac1-cd55-47a0-a17d-cf953d38b2ea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dzjgx" Apr 24 23:53:44.059202 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:44.059170 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0e517da2-c437-430c-aec1-02e2d22665ca-etc-openvswitch\") pod \"ovnkube-node-xnqmd\" (UID: \"0e517da2-c437-430c-aec1-02e2d22665ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnqmd" Apr 24 23:53:44.059202 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:44.059178 2569 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0e517da2-c437-430c-aec1-02e2d22665ca-run-openvswitch\") pod \"ovnkube-node-xnqmd\" (UID: \"0e517da2-c437-430c-aec1-02e2d22665ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnqmd" Apr 24 23:53:44.059202 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:44.059185 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0e517da2-c437-430c-aec1-02e2d22665ca-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-xnqmd\" (UID: \"0e517da2-c437-430c-aec1-02e2d22665ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnqmd" Apr 24 23:53:44.059584 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:44.059222 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0e517da2-c437-430c-aec1-02e2d22665ca-run-openvswitch\") pod \"ovnkube-node-xnqmd\" (UID: \"0e517da2-c437-430c-aec1-02e2d22665ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnqmd" Apr 24 23:53:44.059584 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:44.059227 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0e517da2-c437-430c-aec1-02e2d22665ca-var-lib-openvswitch\") pod \"ovnkube-node-xnqmd\" (UID: \"0e517da2-c437-430c-aec1-02e2d22665ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnqmd" Apr 24 23:53:44.059584 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:44.059230 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/92a57ac1-cd55-47a0-a17d-cf953d38b2ea-sys-fs\") pod \"aws-ebs-csi-driver-node-dzjgx\" (UID: \"92a57ac1-cd55-47a0-a17d-cf953d38b2ea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dzjgx" Apr 24 
23:53:44.059584 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:44.059264 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/92a57ac1-cd55-47a0-a17d-cf953d38b2ea-registration-dir\") pod \"aws-ebs-csi-driver-node-dzjgx\" (UID: \"92a57ac1-cd55-47a0-a17d-cf953d38b2ea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dzjgx" Apr 24 23:53:44.059584 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:44.059276 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dd9e66f5-07c5-45ce-92ad-16649e745d96-host\") pod \"node-ca-rsd5n\" (UID: \"dd9e66f5-07c5-45ce-92ad-16649e745d96\") " pod="openshift-image-registry/node-ca-rsd5n" Apr 24 23:53:44.059584 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:44.059298 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dd9e66f5-07c5-45ce-92ad-16649e745d96-host\") pod \"node-ca-rsd5n\" (UID: \"dd9e66f5-07c5-45ce-92ad-16649e745d96\") " pod="openshift-image-registry/node-ca-rsd5n" Apr 24 23:53:44.059584 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:44.059320 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0e517da2-c437-430c-aec1-02e2d22665ca-ovnkube-script-lib\") pod \"ovnkube-node-xnqmd\" (UID: \"0e517da2-c437-430c-aec1-02e2d22665ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnqmd" Apr 24 23:53:44.059584 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:44.059277 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/92a57ac1-cd55-47a0-a17d-cf953d38b2ea-sys-fs\") pod \"aws-ebs-csi-driver-node-dzjgx\" (UID: \"92a57ac1-cd55-47a0-a17d-cf953d38b2ea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dzjgx" Apr 24 23:53:44.059584 
ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:44.059345 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0e517da2-c437-430c-aec1-02e2d22665ca-host-cni-netd\") pod \"ovnkube-node-xnqmd\" (UID: \"0e517da2-c437-430c-aec1-02e2d22665ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnqmd" Apr 24 23:53:44.059584 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:44.059377 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bmgwg\" (UniqueName: \"kubernetes.io/projected/44764fbc-9742-4d96-ae9e-d45956e60888-kube-api-access-bmgwg\") pod \"network-check-target-lsbf2\" (UID: \"44764fbc-9742-4d96-ae9e-d45956e60888\") " pod="openshift-network-diagnostics/network-check-target-lsbf2" Apr 24 23:53:44.059584 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:44.059397 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0e517da2-c437-430c-aec1-02e2d22665ca-host-cni-netd\") pod \"ovnkube-node-xnqmd\" (UID: \"0e517da2-c437-430c-aec1-02e2d22665ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnqmd" Apr 24 23:53:44.059584 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:44.059429 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0e517da2-c437-430c-aec1-02e2d22665ca-run-ovn\") pod \"ovnkube-node-xnqmd\" (UID: \"0e517da2-c437-430c-aec1-02e2d22665ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnqmd" Apr 24 23:53:44.059584 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:44.059462 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nsgz5\" (UniqueName: \"kubernetes.io/projected/0e517da2-c437-430c-aec1-02e2d22665ca-kube-api-access-nsgz5\") pod \"ovnkube-node-xnqmd\" (UID: \"0e517da2-c437-430c-aec1-02e2d22665ca\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-xnqmd" Apr 24 23:53:44.059584 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:44.059493 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t992g\" (UniqueName: \"kubernetes.io/projected/92a57ac1-cd55-47a0-a17d-cf953d38b2ea-kube-api-access-t992g\") pod \"aws-ebs-csi-driver-node-dzjgx\" (UID: \"92a57ac1-cd55-47a0-a17d-cf953d38b2ea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dzjgx" Apr 24 23:53:44.059584 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:44.059518 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0e517da2-c437-430c-aec1-02e2d22665ca-run-systemd\") pod \"ovnkube-node-xnqmd\" (UID: \"0e517da2-c437-430c-aec1-02e2d22665ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnqmd" Apr 24 23:53:44.059584 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:44.059543 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0e517da2-c437-430c-aec1-02e2d22665ca-host-run-ovn-kubernetes\") pod \"ovnkube-node-xnqmd\" (UID: \"0e517da2-c437-430c-aec1-02e2d22665ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnqmd" Apr 24 23:53:44.059584 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:44.059568 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/92a57ac1-cd55-47a0-a17d-cf953d38b2ea-kubelet-dir\") pod \"aws-ebs-csi-driver-node-dzjgx\" (UID: \"92a57ac1-cd55-47a0-a17d-cf953d38b2ea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dzjgx" Apr 24 23:53:44.060322 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:44.059593 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: 
\"kubernetes.io/host-path/92a57ac1-cd55-47a0-a17d-cf953d38b2ea-device-dir\") pod \"aws-ebs-csi-driver-node-dzjgx\" (UID: \"92a57ac1-cd55-47a0-a17d-cf953d38b2ea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dzjgx" Apr 24 23:53:44.060322 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:44.059617 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8tn2v\" (UniqueName: \"kubernetes.io/projected/dd9e66f5-07c5-45ce-92ad-16649e745d96-kube-api-access-8tn2v\") pod \"node-ca-rsd5n\" (UID: \"dd9e66f5-07c5-45ce-92ad-16649e745d96\") " pod="openshift-image-registry/node-ca-rsd5n" Apr 24 23:53:44.060322 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:44.059645 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0e517da2-c437-430c-aec1-02e2d22665ca-host-kubelet\") pod \"ovnkube-node-xnqmd\" (UID: \"0e517da2-c437-430c-aec1-02e2d22665ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnqmd" Apr 24 23:53:44.060322 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:44.059669 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0e517da2-c437-430c-aec1-02e2d22665ca-ovn-node-metrics-cert\") pod \"ovnkube-node-xnqmd\" (UID: \"0e517da2-c437-430c-aec1-02e2d22665ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnqmd" Apr 24 23:53:44.060322 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:44.059681 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0e517da2-c437-430c-aec1-02e2d22665ca-run-systemd\") pod \"ovnkube-node-xnqmd\" (UID: \"0e517da2-c437-430c-aec1-02e2d22665ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnqmd" Apr 24 23:53:44.060322 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:44.059695 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/92d3f68f-64da-4914-9d0e-66109d7ac351-host-slash\") pod \"iptables-alerter-5s57p\" (UID: \"92d3f68f-64da-4914-9d0e-66109d7ac351\") " pod="openshift-network-operator/iptables-alerter-5s57p"
Apr 24 23:53:44.060322 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:44.059722 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/92a57ac1-cd55-47a0-a17d-cf953d38b2ea-etc-selinux\") pod \"aws-ebs-csi-driver-node-dzjgx\" (UID: \"92a57ac1-cd55-47a0-a17d-cf953d38b2ea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dzjgx"
Apr 24 23:53:44.060322 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:44.059748 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0e517da2-c437-430c-aec1-02e2d22665ca-node-log\") pod \"ovnkube-node-xnqmd\" (UID: \"0e517da2-c437-430c-aec1-02e2d22665ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnqmd"
Apr 24 23:53:44.060322 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:44.059779 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0e517da2-c437-430c-aec1-02e2d22665ca-host-run-ovn-kubernetes\") pod \"ovnkube-node-xnqmd\" (UID: \"0e517da2-c437-430c-aec1-02e2d22665ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnqmd"
Apr 24 23:53:44.060322 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:44.059782 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0e517da2-c437-430c-aec1-02e2d22665ca-host-cni-bin\") pod \"ovnkube-node-xnqmd\" (UID: \"0e517da2-c437-430c-aec1-02e2d22665ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnqmd"
Apr 24 23:53:44.060322 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:44.059840 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/92a57ac1-cd55-47a0-a17d-cf953d38b2ea-kubelet-dir\") pod \"aws-ebs-csi-driver-node-dzjgx\" (UID: \"92a57ac1-cd55-47a0-a17d-cf953d38b2ea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dzjgx"
Apr 24 23:53:44.060322 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:44.059842 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0e517da2-c437-430c-aec1-02e2d22665ca-host-cni-bin\") pod \"ovnkube-node-xnqmd\" (UID: \"0e517da2-c437-430c-aec1-02e2d22665ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnqmd"
Apr 24 23:53:44.060322 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:44.059843 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0e517da2-c437-430c-aec1-02e2d22665ca-host-slash\") pod \"ovnkube-node-xnqmd\" (UID: \"0e517da2-c437-430c-aec1-02e2d22665ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnqmd"
Apr 24 23:53:44.060322 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:44.059876 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0e517da2-c437-430c-aec1-02e2d22665ca-host-slash\") pod \"ovnkube-node-xnqmd\" (UID: \"0e517da2-c437-430c-aec1-02e2d22665ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnqmd"
Apr 24 23:53:44.060322 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:44.059881 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/92a57ac1-cd55-47a0-a17d-cf953d38b2ea-device-dir\") pod \"aws-ebs-csi-driver-node-dzjgx\" (UID: \"92a57ac1-cd55-47a0-a17d-cf953d38b2ea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dzjgx"
Apr 24 23:53:44.060322 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:44.059888 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/92d3f68f-64da-4914-9d0e-66109d7ac351-iptables-alerter-script\") pod \"iptables-alerter-5s57p\" (UID: \"92d3f68f-64da-4914-9d0e-66109d7ac351\") " pod="openshift-network-operator/iptables-alerter-5s57p"
Apr 24 23:53:44.060322 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:44.059914 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nnwrs\" (UniqueName: \"kubernetes.io/projected/92d3f68f-64da-4914-9d0e-66109d7ac351-kube-api-access-nnwrs\") pod \"iptables-alerter-5s57p\" (UID: \"92d3f68f-64da-4914-9d0e-66109d7ac351\") " pod="openshift-network-operator/iptables-alerter-5s57p"
Apr 24 23:53:44.061110 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:44.059936 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0e517da2-c437-430c-aec1-02e2d22665ca-host-kubelet\") pod \"ovnkube-node-xnqmd\" (UID: \"0e517da2-c437-430c-aec1-02e2d22665ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnqmd"
Apr 24 23:53:44.061110 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:44.059942 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/dd9e66f5-07c5-45ce-92ad-16649e745d96-serviceca\") pod \"node-ca-rsd5n\" (UID: \"dd9e66f5-07c5-45ce-92ad-16649e745d96\") " pod="openshift-image-registry/node-ca-rsd5n"
Apr 24 23:53:44.061110 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:44.059973 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0e517da2-c437-430c-aec1-02e2d22665ca-systemd-units\") pod \"ovnkube-node-xnqmd\" (UID: \"0e517da2-c437-430c-aec1-02e2d22665ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnqmd"
Apr 24 23:53:44.061110 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:44.059996 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0e517da2-c437-430c-aec1-02e2d22665ca-host-run-netns\") pod \"ovnkube-node-xnqmd\" (UID: \"0e517da2-c437-430c-aec1-02e2d22665ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnqmd"
Apr 24 23:53:44.061110 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:44.060026 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0e517da2-c437-430c-aec1-02e2d22665ca-ovnkube-script-lib\") pod \"ovnkube-node-xnqmd\" (UID: \"0e517da2-c437-430c-aec1-02e2d22665ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnqmd"
Apr 24 23:53:44.061110 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:44.060040 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0e517da2-c437-430c-aec1-02e2d22665ca-log-socket\") pod \"ovnkube-node-xnqmd\" (UID: \"0e517da2-c437-430c-aec1-02e2d22665ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnqmd"
Apr 24 23:53:44.061110 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:44.060084 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0e517da2-c437-430c-aec1-02e2d22665ca-host-run-netns\") pod \"ovnkube-node-xnqmd\" (UID: \"0e517da2-c437-430c-aec1-02e2d22665ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnqmd"
Apr 24 23:53:44.061110 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:44.060097 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0e517da2-c437-430c-aec1-02e2d22665ca-log-socket\") pod \"ovnkube-node-xnqmd\" (UID: \"0e517da2-c437-430c-aec1-02e2d22665ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnqmd"
Apr 24 23:53:44.061110 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:44.060103 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0e517da2-c437-430c-aec1-02e2d22665ca-run-ovn\") pod \"ovnkube-node-xnqmd\" (UID: \"0e517da2-c437-430c-aec1-02e2d22665ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnqmd"
Apr 24 23:53:44.061110 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:44.060131 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0e517da2-c437-430c-aec1-02e2d22665ca-env-overrides\") pod \"ovnkube-node-xnqmd\" (UID: \"0e517da2-c437-430c-aec1-02e2d22665ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnqmd"
Apr 24 23:53:44.061110 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:44.060139 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/92d3f68f-64da-4914-9d0e-66109d7ac351-host-slash\") pod \"iptables-alerter-5s57p\" (UID: \"92d3f68f-64da-4914-9d0e-66109d7ac351\") " pod="openshift-network-operator/iptables-alerter-5s57p"
Apr 24 23:53:44.061110 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:44.060161 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/92a57ac1-cd55-47a0-a17d-cf953d38b2ea-etc-selinux\") pod \"aws-ebs-csi-driver-node-dzjgx\" (UID: \"92a57ac1-cd55-47a0-a17d-cf953d38b2ea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dzjgx"
Apr 24 23:53:44.061110 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:44.060161 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0e517da2-c437-430c-aec1-02e2d22665ca-ovnkube-config\") pod \"ovnkube-node-xnqmd\" (UID: \"0e517da2-c437-430c-aec1-02e2d22665ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnqmd"
Apr 24 23:53:44.061110 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:44.060284 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0e517da2-c437-430c-aec1-02e2d22665ca-node-log\") pod \"ovnkube-node-xnqmd\" (UID: \"0e517da2-c437-430c-aec1-02e2d22665ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnqmd"
Apr 24 23:53:44.061110 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:44.060394 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/92d3f68f-64da-4914-9d0e-66109d7ac351-iptables-alerter-script\") pod \"iptables-alerter-5s57p\" (UID: \"92d3f68f-64da-4914-9d0e-66109d7ac351\") " pod="openshift-network-operator/iptables-alerter-5s57p"
Apr 24 23:53:44.061110 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:44.060465 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0e517da2-c437-430c-aec1-02e2d22665ca-systemd-units\") pod \"ovnkube-node-xnqmd\" (UID: \"0e517da2-c437-430c-aec1-02e2d22665ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnqmd"
Apr 24 23:53:44.061110 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:44.060608 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/dd9e66f5-07c5-45ce-92ad-16649e745d96-serviceca\") pod \"node-ca-rsd5n\" (UID: \"dd9e66f5-07c5-45ce-92ad-16649e745d96\") " pod="openshift-image-registry/node-ca-rsd5n"
Apr 24 23:53:44.061110 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:44.060666 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0e517da2-c437-430c-aec1-02e2d22665ca-ovnkube-config\") pod \"ovnkube-node-xnqmd\" (UID: \"0e517da2-c437-430c-aec1-02e2d22665ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnqmd"
Apr 24 23:53:44.061940 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:44.060670 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0e517da2-c437-430c-aec1-02e2d22665ca-env-overrides\") pod \"ovnkube-node-xnqmd\" (UID: \"0e517da2-c437-430c-aec1-02e2d22665ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnqmd"
Apr 24 23:53:44.062889 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:44.062862 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0e517da2-c437-430c-aec1-02e2d22665ca-ovn-node-metrics-cert\") pod \"ovnkube-node-xnqmd\" (UID: \"0e517da2-c437-430c-aec1-02e2d22665ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnqmd"
Apr 24 23:53:44.066579 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:53:44.066325 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 23:53:44.066579 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:53:44.066349 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 23:53:44.066579 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:53:44.066363 2569 projected.go:194] Error preparing data for projected volume kube-api-access-bmgwg for pod openshift-network-diagnostics/network-check-target-lsbf2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 23:53:44.066579 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:53:44.066450 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/44764fbc-9742-4d96-ae9e-d45956e60888-kube-api-access-bmgwg podName:44764fbc-9742-4d96-ae9e-d45956e60888 nodeName:}" failed. No retries permitted until 2026-04-24 23:53:44.566430333 +0000 UTC m=+3.171156912 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-bmgwg" (UniqueName: "kubernetes.io/projected/44764fbc-9742-4d96-ae9e-d45956e60888-kube-api-access-bmgwg") pod "network-check-target-lsbf2" (UID: "44764fbc-9742-4d96-ae9e-d45956e60888") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 23:53:44.067858 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:44.067832 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tn2v\" (UniqueName: \"kubernetes.io/projected/dd9e66f5-07c5-45ce-92ad-16649e745d96-kube-api-access-8tn2v\") pod \"node-ca-rsd5n\" (UID: \"dd9e66f5-07c5-45ce-92ad-16649e745d96\") " pod="openshift-image-registry/node-ca-rsd5n"
Apr 24 23:53:44.068027 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:44.068010 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t992g\" (UniqueName: \"kubernetes.io/projected/92a57ac1-cd55-47a0-a17d-cf953d38b2ea-kube-api-access-t992g\") pod \"aws-ebs-csi-driver-node-dzjgx\" (UID: \"92a57ac1-cd55-47a0-a17d-cf953d38b2ea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dzjgx"
Apr 24 23:53:44.068365 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:44.068347 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnwrs\" (UniqueName: \"kubernetes.io/projected/92d3f68f-64da-4914-9d0e-66109d7ac351-kube-api-access-nnwrs\") pod \"iptables-alerter-5s57p\" (UID: \"92d3f68f-64da-4914-9d0e-66109d7ac351\") " pod="openshift-network-operator/iptables-alerter-5s57p"
Apr 24 23:53:44.068563 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:44.068548 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nsgz5\" (UniqueName: \"kubernetes.io/projected/0e517da2-c437-430c-aec1-02e2d22665ca-kube-api-access-nsgz5\") pod \"ovnkube-node-xnqmd\" (UID: \"0e517da2-c437-430c-aec1-02e2d22665ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-xnqmd"
Apr 24 23:53:44.147747 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:44.147659 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-2r7xx"
Apr 24 23:53:44.156140 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:44.156118 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-4lz5c"
Apr 24 23:53:44.163858 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:44.163831 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-gv8rj"
Apr 24 23:53:44.169414 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:44.169385 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-pk5cg"
Apr 24 23:53:44.177950 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:44.177929 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-5s57p"
Apr 24 23:53:44.185507 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:44.185488 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dzjgx"
Apr 24 23:53:44.191974 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:44.191953 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-rsd5n"
Apr 24 23:53:44.198561 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:44.198542 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-xnqmd"
Apr 24 23:53:44.463784 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:44.463651 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/37e765cb-b1c9-4330-ac47-4918ba2ebf0a-metrics-certs\") pod \"network-metrics-daemon-npwg7\" (UID: \"37e765cb-b1c9-4330-ac47-4918ba2ebf0a\") " pod="openshift-multus/network-metrics-daemon-npwg7"
Apr 24 23:53:44.463946 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:53:44.463787 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 23:53:44.463946 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:53:44.463899 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/37e765cb-b1c9-4330-ac47-4918ba2ebf0a-metrics-certs podName:37e765cb-b1c9-4330-ac47-4918ba2ebf0a nodeName:}" failed. No retries permitted until 2026-04-24 23:53:45.463878411 +0000 UTC m=+4.068604998 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/37e765cb-b1c9-4330-ac47-4918ba2ebf0a-metrics-certs") pod "network-metrics-daemon-npwg7" (UID: "37e765cb-b1c9-4330-ac47-4918ba2ebf0a") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 23:53:44.539060 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:44.539018 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04a809d7_5a4b_4d1b_b069_41b0cd06e320.slice/crio-1b973c9403714232edfba97a60809e8c0e5bacda05b6721c7a930e5dabfa8874 WatchSource:0}: Error finding container 1b973c9403714232edfba97a60809e8c0e5bacda05b6721c7a930e5dabfa8874: Status 404 returned error can't find the container with id 1b973c9403714232edfba97a60809e8c0e5bacda05b6721c7a930e5dabfa8874
Apr 24 23:53:44.540339 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:44.540316 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podddb0ca40_9985_4e85_85cc_14ecd836b337.slice/crio-f2081b777c266c132a2c4c174529aa4c64b9e6160db83111cc136e4091974620 WatchSource:0}: Error finding container f2081b777c266c132a2c4c174529aa4c64b9e6160db83111cc136e4091974620: Status 404 returned error can't find the container with id f2081b777c266c132a2c4c174529aa4c64b9e6160db83111cc136e4091974620
Apr 24 23:53:44.545434 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:44.545389 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92d3f68f_64da_4914_9d0e_66109d7ac351.slice/crio-22bf322a311638cf849125fd5ac3c68814e196a12d297c39a284356128857f33 WatchSource:0}: Error finding container 22bf322a311638cf849125fd5ac3c68814e196a12d297c39a284356128857f33: Status 404 returned error can't find the container with id 22bf322a311638cf849125fd5ac3c68814e196a12d297c39a284356128857f33
Apr 24 23:53:44.545699 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:44.545674 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd9e66f5_07c5_45ce_92ad_16649e745d96.slice/crio-c892d4a7342b8beea9fd233b127f38b9cd0a2cadfcfb81df1d4172023eaa0cc6 WatchSource:0}: Error finding container c892d4a7342b8beea9fd233b127f38b9cd0a2cadfcfb81df1d4172023eaa0cc6: Status 404 returned error can't find the container with id c892d4a7342b8beea9fd233b127f38b9cd0a2cadfcfb81df1d4172023eaa0cc6
Apr 24 23:53:44.546356 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:44.546241 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9b41727_9688_4980_8671_a860f9ccf954.slice/crio-9748816c792236ecbf1e7f3ed8fdbaa288cb44ffcaa63ee2657f53179767c411 WatchSource:0}: Error finding container 9748816c792236ecbf1e7f3ed8fdbaa288cb44ffcaa63ee2657f53179767c411: Status 404 returned error can't find the container with id 9748816c792236ecbf1e7f3ed8fdbaa288cb44ffcaa63ee2657f53179767c411
Apr 24 23:53:44.547565 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:44.547475 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcfab2e9e_eb84_4b70_bc59_197bc3f27fb6.slice/crio-fa0cfa4a9afe4b76f3b1b3dbd0f39b071f60b9f116e9b2b7f16bde506e175b43 WatchSource:0}: Error finding container fa0cfa4a9afe4b76f3b1b3dbd0f39b071f60b9f116e9b2b7f16bde506e175b43: Status 404 returned error can't find the container with id fa0cfa4a9afe4b76f3b1b3dbd0f39b071f60b9f116e9b2b7f16bde506e175b43
Apr 24 23:53:44.548214 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:44.548063 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e517da2_c437_430c_aec1_02e2d22665ca.slice/crio-c7ace4b205ec4e1bb86ea78655ad8c83b334b84fb82d0a72451145dbd30b30fe WatchSource:0}: Error finding container c7ace4b205ec4e1bb86ea78655ad8c83b334b84fb82d0a72451145dbd30b30fe: Status 404 returned error can't find the container with id c7ace4b205ec4e1bb86ea78655ad8c83b334b84fb82d0a72451145dbd30b30fe
Apr 24 23:53:44.550295 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:53:44.550247 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92a57ac1_cd55_47a0_a17d_cf953d38b2ea.slice/crio-10bf3b68f902d9ace0ed875846a4c5759b78841c30e63e7341c5273371b90e8e WatchSource:0}: Error finding container 10bf3b68f902d9ace0ed875846a4c5759b78841c30e63e7341c5273371b90e8e: Status 404 returned error can't find the container with id 10bf3b68f902d9ace0ed875846a4c5759b78841c30e63e7341c5273371b90e8e
Apr 24 23:53:44.665670 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:44.665645 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bmgwg\" (UniqueName: \"kubernetes.io/projected/44764fbc-9742-4d96-ae9e-d45956e60888-kube-api-access-bmgwg\") pod \"network-check-target-lsbf2\" (UID: \"44764fbc-9742-4d96-ae9e-d45956e60888\") " pod="openshift-network-diagnostics/network-check-target-lsbf2"
Apr 24 23:53:44.665784 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:53:44.665759 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 23:53:44.665784 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:53:44.665771 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 23:53:44.665784 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:53:44.665780 2569 projected.go:194] Error preparing data for projected volume kube-api-access-bmgwg for pod openshift-network-diagnostics/network-check-target-lsbf2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 23:53:44.665879 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:53:44.665820 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/44764fbc-9742-4d96-ae9e-d45956e60888-kube-api-access-bmgwg podName:44764fbc-9742-4d96-ae9e-d45956e60888 nodeName:}" failed. No retries permitted until 2026-04-24 23:53:45.665808152 +0000 UTC m=+4.270534729 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-bmgwg" (UniqueName: "kubernetes.io/projected/44764fbc-9742-4d96-ae9e-d45956e60888-kube-api-access-bmgwg") pod "network-check-target-lsbf2" (UID: "44764fbc-9742-4d96-ae9e-d45956e60888") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 23:53:44.889454 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:44.889348 2569 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 23:48:42 +0000 UTC" deadline="2028-01-28 17:54:55.22996631 +0000 UTC"
Apr 24 23:53:44.889454 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:44.889388 2569 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15450h1m10.340581888s"
Apr 24 23:53:44.989938 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:44.989881 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-214.ec2.internal" event={"ID":"413e1dcc7a9acaef383b6c159ccd3bed","Type":"ContainerStarted","Data":"661d2863326b550761babae66ac35dd272cdf3e864a097096da7f30338ac5ff7"}
Apr 24 23:53:44.992707 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:44.992611 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xnqmd" event={"ID":"0e517da2-c437-430c-aec1-02e2d22665ca","Type":"ContainerStarted","Data":"c7ace4b205ec4e1bb86ea78655ad8c83b334b84fb82d0a72451145dbd30b30fe"}
Apr 24 23:53:44.996943 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:44.996892 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gv8rj" event={"ID":"cfab2e9e-eb84-4b70-bc59-197bc3f27fb6","Type":"ContainerStarted","Data":"fa0cfa4a9afe4b76f3b1b3dbd0f39b071f60b9f116e9b2b7f16bde506e175b43"}
Apr 24 23:53:44.998689 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:44.998634 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-rsd5n" event={"ID":"dd9e66f5-07c5-45ce-92ad-16649e745d96","Type":"ContainerStarted","Data":"c892d4a7342b8beea9fd233b127f38b9cd0a2cadfcfb81df1d4172023eaa0cc6"}
Apr 24 23:53:45.004289 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:45.003637 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-214.ec2.internal" podStartSLOduration=3.003623118 podStartE2EDuration="3.003623118s" podCreationTimestamp="2026-04-24 23:53:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 23:53:45.00317937 +0000 UTC m=+3.607905969" watchObservedRunningTime="2026-04-24 23:53:45.003623118 +0000 UTC m=+3.608349714"
Apr 24 23:53:45.006193 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:45.006153 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-4lz5c" event={"ID":"ddb0ca40-9985-4e85-85cc-14ecd836b337","Type":"ContainerStarted","Data":"f2081b777c266c132a2c4c174529aa4c64b9e6160db83111cc136e4091974620"}
Apr 24 23:53:45.008660 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:45.008594 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pk5cg" event={"ID":"04a809d7-5a4b-4d1b-b069-41b0cd06e320","Type":"ContainerStarted","Data":"1b973c9403714232edfba97a60809e8c0e5bacda05b6721c7a930e5dabfa8874"}
Apr 24 23:53:45.014009 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:45.013965 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dzjgx" event={"ID":"92a57ac1-cd55-47a0-a17d-cf953d38b2ea","Type":"ContainerStarted","Data":"10bf3b68f902d9ace0ed875846a4c5759b78841c30e63e7341c5273371b90e8e"}
Apr 24 23:53:45.015539 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:45.015463 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-2r7xx" event={"ID":"a9b41727-9688-4980-8671-a860f9ccf954","Type":"ContainerStarted","Data":"9748816c792236ecbf1e7f3ed8fdbaa288cb44ffcaa63ee2657f53179767c411"}
Apr 24 23:53:45.019821 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:45.019775 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-5s57p" event={"ID":"92d3f68f-64da-4914-9d0e-66109d7ac351","Type":"ContainerStarted","Data":"22bf322a311638cf849125fd5ac3c68814e196a12d297c39a284356128857f33"}
Apr 24 23:53:45.473901 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:45.473266 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/37e765cb-b1c9-4330-ac47-4918ba2ebf0a-metrics-certs\") pod \"network-metrics-daemon-npwg7\" (UID: \"37e765cb-b1c9-4330-ac47-4918ba2ebf0a\") " pod="openshift-multus/network-metrics-daemon-npwg7"
Apr 24 23:53:45.473901 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:53:45.473511 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 23:53:45.473901 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:53:45.473575 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/37e765cb-b1c9-4330-ac47-4918ba2ebf0a-metrics-certs podName:37e765cb-b1c9-4330-ac47-4918ba2ebf0a nodeName:}" failed. No retries permitted until 2026-04-24 23:53:47.473556219 +0000 UTC m=+6.078282813 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/37e765cb-b1c9-4330-ac47-4918ba2ebf0a-metrics-certs") pod "network-metrics-daemon-npwg7" (UID: "37e765cb-b1c9-4330-ac47-4918ba2ebf0a") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 23:53:45.674786 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:45.674177 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bmgwg\" (UniqueName: \"kubernetes.io/projected/44764fbc-9742-4d96-ae9e-d45956e60888-kube-api-access-bmgwg\") pod \"network-check-target-lsbf2\" (UID: \"44764fbc-9742-4d96-ae9e-d45956e60888\") " pod="openshift-network-diagnostics/network-check-target-lsbf2"
Apr 24 23:53:45.674786 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:53:45.674375 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 23:53:45.674786 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:53:45.674394 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 23:53:45.674786 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:53:45.674433 2569 projected.go:194] Error preparing data for projected volume kube-api-access-bmgwg for pod openshift-network-diagnostics/network-check-target-lsbf2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 23:53:45.674786 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:53:45.674495 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/44764fbc-9742-4d96-ae9e-d45956e60888-kube-api-access-bmgwg podName:44764fbc-9742-4d96-ae9e-d45956e60888 nodeName:}" failed. No retries permitted until 2026-04-24 23:53:47.674475646 +0000 UTC m=+6.279202234 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-bmgwg" (UniqueName: "kubernetes.io/projected/44764fbc-9742-4d96-ae9e-d45956e60888-kube-api-access-bmgwg") pod "network-check-target-lsbf2" (UID: "44764fbc-9742-4d96-ae9e-d45956e60888") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 23:53:45.974110 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:45.973586 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-npwg7"
Apr 24 23:53:45.974110 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:53:45.973734 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-npwg7" podUID="37e765cb-b1c9-4330-ac47-4918ba2ebf0a"
Apr 24 23:53:45.974110 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:45.973818 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lsbf2"
Apr 24 23:53:45.974110 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:53:45.973979 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lsbf2" podUID="44764fbc-9742-4d96-ae9e-d45956e60888"
Apr 24 23:53:46.032571 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:46.032534 2569 generic.go:358] "Generic (PLEG): container finished" podID="20c8b549d04c6a6f4001cebf1b8f071a" containerID="8f2e69728192edae1e35da2fe9c3fe400fd9dab0909b838390a83a259735da83" exitCode=0
Apr 24 23:53:46.033462 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:46.033436 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-214.ec2.internal" event={"ID":"20c8b549d04c6a6f4001cebf1b8f071a","Type":"ContainerDied","Data":"8f2e69728192edae1e35da2fe9c3fe400fd9dab0909b838390a83a259735da83"}
Apr 24 23:53:47.039675 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:47.039619 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-214.ec2.internal" event={"ID":"20c8b549d04c6a6f4001cebf1b8f071a","Type":"ContainerStarted","Data":"e37a8b3f39bccb4f9a6b576baef01cf5898fbc809b582c954a77d0a82c469638"}
Apr 24 23:53:47.052595 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:47.052536 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-214.ec2.internal" podStartSLOduration=5.052517319 podStartE2EDuration="5.052517319s" podCreationTimestamp="2026-04-24 23:53:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 23:53:47.051900267 +0000 UTC m=+5.656626884" watchObservedRunningTime="2026-04-24 23:53:47.052517319 +0000 UTC m=+5.657243919"
Apr 24 23:53:47.489511 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:47.489473 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/37e765cb-b1c9-4330-ac47-4918ba2ebf0a-metrics-certs\") pod \"network-metrics-daemon-npwg7\" (UID: \"37e765cb-b1c9-4330-ac47-4918ba2ebf0a\") " pod="openshift-multus/network-metrics-daemon-npwg7"
Apr 24 23:53:47.489675 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:53:47.489648 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 23:53:47.489750 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:53:47.489717 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/37e765cb-b1c9-4330-ac47-4918ba2ebf0a-metrics-certs podName:37e765cb-b1c9-4330-ac47-4918ba2ebf0a nodeName:}" failed. No retries permitted until 2026-04-24 23:53:51.489697683 +0000 UTC m=+10.094424259 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/37e765cb-b1c9-4330-ac47-4918ba2ebf0a-metrics-certs") pod "network-metrics-daemon-npwg7" (UID: "37e765cb-b1c9-4330-ac47-4918ba2ebf0a") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 23:53:47.691518 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:47.691290 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bmgwg\" (UniqueName: \"kubernetes.io/projected/44764fbc-9742-4d96-ae9e-d45956e60888-kube-api-access-bmgwg\") pod \"network-check-target-lsbf2\" (UID: \"44764fbc-9742-4d96-ae9e-d45956e60888\") " pod="openshift-network-diagnostics/network-check-target-lsbf2"
Apr 24 23:53:47.691518 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:53:47.691472 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 23:53:47.691518 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:53:47.691492 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 23:53:47.691518 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:53:47.691521 2569 projected.go:194] Error preparing data for projected volume kube-api-access-bmgwg for pod openshift-network-diagnostics/network-check-target-lsbf2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 23:53:47.691849 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:53:47.691579 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/44764fbc-9742-4d96-ae9e-d45956e60888-kube-api-access-bmgwg podName:44764fbc-9742-4d96-ae9e-d45956e60888 nodeName:}" failed. No retries permitted until 2026-04-24 23:53:51.691560904 +0000 UTC m=+10.296287484 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-bmgwg" (UniqueName: "kubernetes.io/projected/44764fbc-9742-4d96-ae9e-d45956e60888-kube-api-access-bmgwg") pod "network-check-target-lsbf2" (UID: "44764fbc-9742-4d96-ae9e-d45956e60888") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 23:53:47.972390 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:47.972361 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-npwg7"
Apr 24 23:53:47.972690 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:53:47.972665 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-npwg7" podUID="37e765cb-b1c9-4330-ac47-4918ba2ebf0a" Apr 24 23:53:47.973200 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:47.973180 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lsbf2" Apr 24 23:53:47.973320 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:53:47.973300 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lsbf2" podUID="44764fbc-9742-4d96-ae9e-d45956e60888" Apr 24 23:53:49.972561 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:49.972527 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-npwg7" Apr 24 23:53:49.972561 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:49.972548 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lsbf2" Apr 24 23:53:49.973084 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:53:49.972728 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-npwg7" podUID="37e765cb-b1c9-4330-ac47-4918ba2ebf0a" Apr 24 23:53:49.973084 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:53:49.972848 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lsbf2" podUID="44764fbc-9742-4d96-ae9e-d45956e60888" Apr 24 23:53:51.520324 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:51.520281 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/37e765cb-b1c9-4330-ac47-4918ba2ebf0a-metrics-certs\") pod \"network-metrics-daemon-npwg7\" (UID: \"37e765cb-b1c9-4330-ac47-4918ba2ebf0a\") " pod="openshift-multus/network-metrics-daemon-npwg7" Apr 24 23:53:51.520813 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:53:51.520467 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 23:53:51.520813 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:53:51.520550 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/37e765cb-b1c9-4330-ac47-4918ba2ebf0a-metrics-certs podName:37e765cb-b1c9-4330-ac47-4918ba2ebf0a nodeName:}" failed. No retries permitted until 2026-04-24 23:53:59.520529019 +0000 UTC m=+18.125255618 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/37e765cb-b1c9-4330-ac47-4918ba2ebf0a-metrics-certs") pod "network-metrics-daemon-npwg7" (UID: "37e765cb-b1c9-4330-ac47-4918ba2ebf0a") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 23:53:51.722161 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:51.721985 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bmgwg\" (UniqueName: \"kubernetes.io/projected/44764fbc-9742-4d96-ae9e-d45956e60888-kube-api-access-bmgwg\") pod \"network-check-target-lsbf2\" (UID: \"44764fbc-9742-4d96-ae9e-d45956e60888\") " pod="openshift-network-diagnostics/network-check-target-lsbf2" Apr 24 23:53:51.722161 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:53:51.722158 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 23:53:51.722161 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:53:51.722178 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 23:53:51.722533 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:53:51.722191 2569 projected.go:194] Error preparing data for projected volume kube-api-access-bmgwg for pod openshift-network-diagnostics/network-check-target-lsbf2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 23:53:51.722533 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:53:51.722254 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/44764fbc-9742-4d96-ae9e-d45956e60888-kube-api-access-bmgwg podName:44764fbc-9742-4d96-ae9e-d45956e60888 nodeName:}" failed. 
No retries permitted until 2026-04-24 23:53:59.72223684 +0000 UTC m=+18.326963416 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-bmgwg" (UniqueName: "kubernetes.io/projected/44764fbc-9742-4d96-ae9e-d45956e60888-kube-api-access-bmgwg") pod "network-check-target-lsbf2" (UID: "44764fbc-9742-4d96-ae9e-d45956e60888") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 23:53:51.973978 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:51.973896 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-npwg7" Apr 24 23:53:51.974141 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:53:51.974049 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-npwg7" podUID="37e765cb-b1c9-4330-ac47-4918ba2ebf0a" Apr 24 23:53:51.974141 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:51.974104 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lsbf2" Apr 24 23:53:51.974267 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:53:51.974217 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-lsbf2" podUID="44764fbc-9742-4d96-ae9e-d45956e60888" Apr 24 23:53:53.973302 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:53.973103 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-npwg7" Apr 24 23:53:53.973741 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:53.973119 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lsbf2" Apr 24 23:53:53.973741 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:53:53.973436 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-npwg7" podUID="37e765cb-b1c9-4330-ac47-4918ba2ebf0a" Apr 24 23:53:53.973741 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:53:53.973463 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lsbf2" podUID="44764fbc-9742-4d96-ae9e-d45956e60888" Apr 24 23:53:55.973069 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:55.973033 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-npwg7" Apr 24 23:53:55.973508 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:55.973033 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lsbf2" Apr 24 23:53:55.973508 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:53:55.973173 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-npwg7" podUID="37e765cb-b1c9-4330-ac47-4918ba2ebf0a" Apr 24 23:53:55.973508 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:53:55.973208 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lsbf2" podUID="44764fbc-9742-4d96-ae9e-d45956e60888" Apr 24 23:53:57.972351 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:57.972296 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-npwg7" Apr 24 23:53:57.972351 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:57.972349 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lsbf2" Apr 24 23:53:57.972873 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:53:57.972473 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-npwg7" podUID="37e765cb-b1c9-4330-ac47-4918ba2ebf0a" Apr 24 23:53:57.972873 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:53:57.972578 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lsbf2" podUID="44764fbc-9742-4d96-ae9e-d45956e60888" Apr 24 23:53:59.582505 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:59.582462 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/37e765cb-b1c9-4330-ac47-4918ba2ebf0a-metrics-certs\") pod \"network-metrics-daemon-npwg7\" (UID: \"37e765cb-b1c9-4330-ac47-4918ba2ebf0a\") " pod="openshift-multus/network-metrics-daemon-npwg7" Apr 24 23:53:59.582973 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:53:59.582587 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 23:53:59.582973 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:53:59.582654 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/37e765cb-b1c9-4330-ac47-4918ba2ebf0a-metrics-certs podName:37e765cb-b1c9-4330-ac47-4918ba2ebf0a nodeName:}" failed. No retries permitted until 2026-04-24 23:54:15.582637896 +0000 UTC m=+34.187364472 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/37e765cb-b1c9-4330-ac47-4918ba2ebf0a-metrics-certs") pod "network-metrics-daemon-npwg7" (UID: "37e765cb-b1c9-4330-ac47-4918ba2ebf0a") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 23:53:59.783466 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:59.783434 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bmgwg\" (UniqueName: \"kubernetes.io/projected/44764fbc-9742-4d96-ae9e-d45956e60888-kube-api-access-bmgwg\") pod \"network-check-target-lsbf2\" (UID: \"44764fbc-9742-4d96-ae9e-d45956e60888\") " pod="openshift-network-diagnostics/network-check-target-lsbf2" Apr 24 23:53:59.783679 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:53:59.783570 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 23:53:59.783679 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:53:59.783589 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 23:53:59.783679 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:53:59.783601 2569 projected.go:194] Error preparing data for projected volume kube-api-access-bmgwg for pod openshift-network-diagnostics/network-check-target-lsbf2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 23:53:59.783679 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:53:59.783661 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/44764fbc-9742-4d96-ae9e-d45956e60888-kube-api-access-bmgwg podName:44764fbc-9742-4d96-ae9e-d45956e60888 nodeName:}" failed. 
No retries permitted until 2026-04-24 23:54:15.783642328 +0000 UTC m=+34.388368910 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-bmgwg" (UniqueName: "kubernetes.io/projected/44764fbc-9742-4d96-ae9e-d45956e60888-kube-api-access-bmgwg") pod "network-check-target-lsbf2" (UID: "44764fbc-9742-4d96-ae9e-d45956e60888") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 23:53:59.936795 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:59.936727 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-pfmxs"] Apr 24 23:53:59.962177 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:59.962148 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-pfmxs" Apr 24 23:53:59.964889 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:59.964833 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 24 23:53:59.965008 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:59.964896 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 24 23:53:59.965008 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:59.964912 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-l5cdl\"" Apr 24 23:53:59.972591 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:59.972569 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-npwg7" Apr 24 23:53:59.972697 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:53:59.972598 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lsbf2" Apr 24 23:53:59.972751 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:53:59.972709 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-npwg7" podUID="37e765cb-b1c9-4330-ac47-4918ba2ebf0a" Apr 24 23:53:59.972838 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:53:59.972812 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lsbf2" podUID="44764fbc-9742-4d96-ae9e-d45956e60888" Apr 24 23:54:00.086496 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:00.086458 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b85ll\" (UniqueName: \"kubernetes.io/projected/191956df-226e-4581-bd3c-7661b41d8536-kube-api-access-b85ll\") pod \"node-resolver-pfmxs\" (UID: \"191956df-226e-4581-bd3c-7661b41d8536\") " pod="openshift-dns/node-resolver-pfmxs" Apr 24 23:54:00.086665 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:00.086587 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/191956df-226e-4581-bd3c-7661b41d8536-tmp-dir\") pod \"node-resolver-pfmxs\" (UID: \"191956df-226e-4581-bd3c-7661b41d8536\") " pod="openshift-dns/node-resolver-pfmxs" Apr 24 23:54:00.086665 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:00.086628 2569 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/191956df-226e-4581-bd3c-7661b41d8536-hosts-file\") pod \"node-resolver-pfmxs\" (UID: \"191956df-226e-4581-bd3c-7661b41d8536\") " pod="openshift-dns/node-resolver-pfmxs" Apr 24 23:54:00.187544 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:00.187468 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/191956df-226e-4581-bd3c-7661b41d8536-tmp-dir\") pod \"node-resolver-pfmxs\" (UID: \"191956df-226e-4581-bd3c-7661b41d8536\") " pod="openshift-dns/node-resolver-pfmxs" Apr 24 23:54:00.187544 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:00.187507 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/191956df-226e-4581-bd3c-7661b41d8536-hosts-file\") pod \"node-resolver-pfmxs\" (UID: \"191956df-226e-4581-bd3c-7661b41d8536\") " pod="openshift-dns/node-resolver-pfmxs" Apr 24 23:54:00.187733 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:00.187566 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b85ll\" (UniqueName: \"kubernetes.io/projected/191956df-226e-4581-bd3c-7661b41d8536-kube-api-access-b85ll\") pod \"node-resolver-pfmxs\" (UID: \"191956df-226e-4581-bd3c-7661b41d8536\") " pod="openshift-dns/node-resolver-pfmxs" Apr 24 23:54:00.187733 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:00.187668 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/191956df-226e-4581-bd3c-7661b41d8536-hosts-file\") pod \"node-resolver-pfmxs\" (UID: \"191956df-226e-4581-bd3c-7661b41d8536\") " pod="openshift-dns/node-resolver-pfmxs" Apr 24 23:54:00.187853 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:00.187836 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/191956df-226e-4581-bd3c-7661b41d8536-tmp-dir\") pod \"node-resolver-pfmxs\" (UID: \"191956df-226e-4581-bd3c-7661b41d8536\") " pod="openshift-dns/node-resolver-pfmxs" Apr 24 23:54:00.198759 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:00.198728 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b85ll\" (UniqueName: \"kubernetes.io/projected/191956df-226e-4581-bd3c-7661b41d8536-kube-api-access-b85ll\") pod \"node-resolver-pfmxs\" (UID: \"191956df-226e-4581-bd3c-7661b41d8536\") " pod="openshift-dns/node-resolver-pfmxs" Apr 24 23:54:00.271622 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:00.271589 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-pfmxs" Apr 24 23:54:01.126370 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:01.126335 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-pwpfb"] Apr 24 23:54:01.151802 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:01.151774 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pwpfb" Apr 24 23:54:01.151959 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:54:01.151843 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-pwpfb" podUID="d13d238f-ed70-4223-a161-a53a31be9a63" Apr 24 23:54:01.296359 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:01.296330 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/d13d238f-ed70-4223-a161-a53a31be9a63-dbus\") pod \"global-pull-secret-syncer-pwpfb\" (UID: \"d13d238f-ed70-4223-a161-a53a31be9a63\") " pod="kube-system/global-pull-secret-syncer-pwpfb" Apr 24 23:54:01.296359 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:01.296360 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d13d238f-ed70-4223-a161-a53a31be9a63-original-pull-secret\") pod \"global-pull-secret-syncer-pwpfb\" (UID: \"d13d238f-ed70-4223-a161-a53a31be9a63\") " pod="kube-system/global-pull-secret-syncer-pwpfb" Apr 24 23:54:01.296539 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:01.296428 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/d13d238f-ed70-4223-a161-a53a31be9a63-kubelet-config\") pod \"global-pull-secret-syncer-pwpfb\" (UID: \"d13d238f-ed70-4223-a161-a53a31be9a63\") " pod="kube-system/global-pull-secret-syncer-pwpfb" Apr 24 23:54:01.397628 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:01.397561 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/d13d238f-ed70-4223-a161-a53a31be9a63-kubelet-config\") pod \"global-pull-secret-syncer-pwpfb\" (UID: \"d13d238f-ed70-4223-a161-a53a31be9a63\") " pod="kube-system/global-pull-secret-syncer-pwpfb" Apr 24 23:54:01.397628 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:01.397613 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: 
\"kubernetes.io/host-path/d13d238f-ed70-4223-a161-a53a31be9a63-dbus\") pod \"global-pull-secret-syncer-pwpfb\" (UID: \"d13d238f-ed70-4223-a161-a53a31be9a63\") " pod="kube-system/global-pull-secret-syncer-pwpfb" Apr 24 23:54:01.397796 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:01.397630 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d13d238f-ed70-4223-a161-a53a31be9a63-original-pull-secret\") pod \"global-pull-secret-syncer-pwpfb\" (UID: \"d13d238f-ed70-4223-a161-a53a31be9a63\") " pod="kube-system/global-pull-secret-syncer-pwpfb" Apr 24 23:54:01.397796 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:01.397688 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/d13d238f-ed70-4223-a161-a53a31be9a63-kubelet-config\") pod \"global-pull-secret-syncer-pwpfb\" (UID: \"d13d238f-ed70-4223-a161-a53a31be9a63\") " pod="kube-system/global-pull-secret-syncer-pwpfb" Apr 24 23:54:01.397796 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:54:01.397733 2569 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 23:54:01.397796 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:01.397741 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/d13d238f-ed70-4223-a161-a53a31be9a63-dbus\") pod \"global-pull-secret-syncer-pwpfb\" (UID: \"d13d238f-ed70-4223-a161-a53a31be9a63\") " pod="kube-system/global-pull-secret-syncer-pwpfb" Apr 24 23:54:01.397796 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:54:01.397775 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d13d238f-ed70-4223-a161-a53a31be9a63-original-pull-secret podName:d13d238f-ed70-4223-a161-a53a31be9a63 nodeName:}" failed. 
No retries permitted until 2026-04-24 23:54:01.897762539 +0000 UTC m=+20.502489114 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d13d238f-ed70-4223-a161-a53a31be9a63-original-pull-secret") pod "global-pull-secret-syncer-pwpfb" (UID: "d13d238f-ed70-4223-a161-a53a31be9a63") : object "kube-system"/"original-pull-secret" not registered
Apr 24 23:54:01.661125 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:54:01.661100 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod191956df_226e_4581_bd3c_7661b41d8536.slice/crio-a23160c708ff6e13e334bbb0d23a787998b30ddf046f05aed2286b6a87f06db1 WatchSource:0}: Error finding container a23160c708ff6e13e334bbb0d23a787998b30ddf046f05aed2286b6a87f06db1: Status 404 returned error can't find the container with id a23160c708ff6e13e334bbb0d23a787998b30ddf046f05aed2286b6a87f06db1
Apr 24 23:54:01.903119 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:01.903093 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d13d238f-ed70-4223-a161-a53a31be9a63-original-pull-secret\") pod \"global-pull-secret-syncer-pwpfb\" (UID: \"d13d238f-ed70-4223-a161-a53a31be9a63\") " pod="kube-system/global-pull-secret-syncer-pwpfb"
Apr 24 23:54:01.903248 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:54:01.903232 2569 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 23:54:01.903308 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:54:01.903290 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d13d238f-ed70-4223-a161-a53a31be9a63-original-pull-secret podName:d13d238f-ed70-4223-a161-a53a31be9a63 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:02.903270498 +0000 UTC m=+21.507997095 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d13d238f-ed70-4223-a161-a53a31be9a63-original-pull-secret") pod "global-pull-secret-syncer-pwpfb" (UID: "d13d238f-ed70-4223-a161-a53a31be9a63") : object "kube-system"/"original-pull-secret" not registered
Apr 24 23:54:01.972738 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:01.972709 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-npwg7"
Apr 24 23:54:01.972838 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:01.972745 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lsbf2"
Apr 24 23:54:01.972838 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:54:01.972815 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-npwg7" podUID="37e765cb-b1c9-4330-ac47-4918ba2ebf0a"
Apr 24 23:54:01.972926 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:54:01.972877 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lsbf2" podUID="44764fbc-9742-4d96-ae9e-d45956e60888"
Apr 24 23:54:02.064932 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:02.064896 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dzjgx" event={"ID":"92a57ac1-cd55-47a0-a17d-cf953d38b2ea","Type":"ContainerStarted","Data":"09e10b92a238b09403cc0b4c1b9d3c014caacab43444c746753939192c643ccc"}
Apr 24 23:54:02.066119 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:02.066094 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-2r7xx" event={"ID":"a9b41727-9688-4980-8671-a860f9ccf954","Type":"ContainerStarted","Data":"673dc2846d61f56a11a059c39220664f23957a68bf9129f8b9d2255aaa3221ab"}
Apr 24 23:54:02.067468 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:02.067438 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gv8rj" event={"ID":"cfab2e9e-eb84-4b70-bc59-197bc3f27fb6","Type":"ContainerStarted","Data":"5f79e50aa125366b09221da30752e4d4b1abb5af4bc30ed8c8f6959b49c308cd"}
Apr 24 23:54:02.068832 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:02.068807 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-rsd5n" event={"ID":"dd9e66f5-07c5-45ce-92ad-16649e745d96","Type":"ContainerStarted","Data":"69d0cd3323fca8f999023ffc47af5b9dc95a24b5847fd046116015fe3adbeda4"}
Apr 24 23:54:02.070203 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:02.070175 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-4lz5c" event={"ID":"ddb0ca40-9985-4e85-85cc-14ecd836b337","Type":"ContainerStarted","Data":"d5d4a4033ad6beed5f9818a75ca3b6c1e3b4d29541a0a5ca3354ab70e3506956"}
Apr 24 23:54:02.071358 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:02.071333 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pk5cg" event={"ID":"04a809d7-5a4b-4d1b-b069-41b0cd06e320","Type":"ContainerStarted","Data":"869f862253c25434f9e620c4cf2ffdfbe0e2006fb6b0c9b8b6810efe0194f61f"}
Apr 24 23:54:02.072619 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:02.072601 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-pfmxs" event={"ID":"191956df-226e-4581-bd3c-7661b41d8536","Type":"ContainerStarted","Data":"bda09d6b28d53967c1f54ae3f9e4ce25814f4505007a580e79255af54ef425df"}
Apr 24 23:54:02.072698 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:02.072623 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-pfmxs" event={"ID":"191956df-226e-4581-bd3c-7661b41d8536","Type":"ContainerStarted","Data":"a23160c708ff6e13e334bbb0d23a787998b30ddf046f05aed2286b6a87f06db1"}
Apr 24 23:54:02.083342 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:02.083307 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-2r7xx" podStartSLOduration=7.952272413 podStartE2EDuration="20.083297121s" podCreationTimestamp="2026-04-24 23:53:42 +0000 UTC" firstStartedPulling="2026-04-24 23:53:44.548470116 +0000 UTC m=+3.153196700" lastFinishedPulling="2026-04-24 23:53:56.679494831 +0000 UTC m=+15.284221408" observedRunningTime="2026-04-24 23:54:02.082873287 +0000 UTC m=+20.687599882" watchObservedRunningTime="2026-04-24 23:54:02.083297121 +0000 UTC m=+20.688023719"
Apr 24 23:54:02.108480 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:02.108441 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-rsd5n" podStartSLOduration=3.003054313 podStartE2EDuration="20.108428418s" podCreationTimestamp="2026-04-24 23:53:42 +0000 UTC" firstStartedPulling="2026-04-24 23:53:44.547878099 +0000 UTC m=+3.152604683" lastFinishedPulling="2026-04-24 23:54:01.653252203 +0000 UTC m=+20.257978788" observedRunningTime="2026-04-24 23:54:02.096212292 +0000 UTC m=+20.700938890" watchObservedRunningTime="2026-04-24 23:54:02.108428418 +0000 UTC m=+20.713155016"
Apr 24 23:54:02.108636 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:02.108617 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-pfmxs" podStartSLOduration=3.108611623 podStartE2EDuration="3.108611623s" podCreationTimestamp="2026-04-24 23:53:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 23:54:02.108423803 +0000 UTC m=+20.713150393" watchObservedRunningTime="2026-04-24 23:54:02.108611623 +0000 UTC m=+20.713338220"
Apr 24 23:54:02.121865 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:02.121820 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-4lz5c" podStartSLOduration=3.009959601 podStartE2EDuration="20.121797028s" podCreationTimestamp="2026-04-24 23:53:42 +0000 UTC" firstStartedPulling="2026-04-24 23:53:44.544072917 +0000 UTC m=+3.148799496" lastFinishedPulling="2026-04-24 23:54:01.655910309 +0000 UTC m=+20.260636923" observedRunningTime="2026-04-24 23:54:02.121168216 +0000 UTC m=+20.725894808" watchObservedRunningTime="2026-04-24 23:54:02.121797028 +0000 UTC m=+20.726523624"
Apr 24 23:54:02.158067 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:02.158016 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-pk5cg" podStartSLOduration=3.039654281 podStartE2EDuration="20.157999091s" podCreationTimestamp="2026-04-24 23:53:42 +0000 UTC" firstStartedPulling="2026-04-24 23:53:44.543920891 +0000 UTC m=+3.148647484" lastFinishedPulling="2026-04-24 23:54:01.662265716 +0000 UTC m=+20.266992294" observedRunningTime="2026-04-24 23:54:02.157466126 +0000 UTC m=+20.762192725" watchObservedRunningTime="2026-04-24 23:54:02.157999091 +0000 UTC m=+20.762725691"
Apr 24 23:54:02.859970 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:02.859804 2569 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 24 23:54:02.911058 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:02.910968 2569 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-24T23:54:02.859965395Z","UUID":"5a32a84c-db5b-45b3-b37b-5ea37ddb7612","Handler":null,"Name":"","Endpoint":""}
Apr 24 23:54:02.911172 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:02.911105 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d13d238f-ed70-4223-a161-a53a31be9a63-original-pull-secret\") pod \"global-pull-secret-syncer-pwpfb\" (UID: \"d13d238f-ed70-4223-a161-a53a31be9a63\") " pod="kube-system/global-pull-secret-syncer-pwpfb"
Apr 24 23:54:02.911252 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:54:02.911239 2569 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 23:54:02.911309 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:54:02.911300 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d13d238f-ed70-4223-a161-a53a31be9a63-original-pull-secret podName:d13d238f-ed70-4223-a161-a53a31be9a63 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:04.911287063 +0000 UTC m=+23.516013638 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d13d238f-ed70-4223-a161-a53a31be9a63-original-pull-secret") pod "global-pull-secret-syncer-pwpfb" (UID: "d13d238f-ed70-4223-a161-a53a31be9a63") : object "kube-system"/"original-pull-secret" not registered
Apr 24 23:54:02.913594 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:02.913576 2569 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 24 23:54:02.913691 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:02.913602 2569 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 24 23:54:02.972187 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:02.972128 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pwpfb"
Apr 24 23:54:02.972282 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:54:02.972219 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-pwpfb" podUID="d13d238f-ed70-4223-a161-a53a31be9a63"
Apr 24 23:54:03.076327 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:03.076292 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xnqmd" event={"ID":"0e517da2-c437-430c-aec1-02e2d22665ca","Type":"ContainerStarted","Data":"513dcfd6ec6081d23465a55f28a1f5ae6eac4b509102ca2e0667c5b86445fdd3"}
Apr 24 23:54:03.076327 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:03.076332 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xnqmd" event={"ID":"0e517da2-c437-430c-aec1-02e2d22665ca","Type":"ContainerStarted","Data":"02fae0ce9dd844bc6ca9b1bec19081505c67d25ecc8b95769f03359c4eca9b48"}
Apr 24 23:54:03.076586 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:03.076347 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xnqmd" event={"ID":"0e517da2-c437-430c-aec1-02e2d22665ca","Type":"ContainerStarted","Data":"3d2ec4d94d6777a83c3fe72a42f8b1c15cc3c0e0487abf446faf026f50f11189"}
Apr 24 23:54:03.076586 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:03.076359 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xnqmd" event={"ID":"0e517da2-c437-430c-aec1-02e2d22665ca","Type":"ContainerStarted","Data":"0c6bd76f8f21ed3104fe81abcfc46be89cfb1e3c5b7cb8cbccd6dba29dcc8cc6"}
Apr 24 23:54:03.076586 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:03.076372 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xnqmd" event={"ID":"0e517da2-c437-430c-aec1-02e2d22665ca","Type":"ContainerStarted","Data":"a5826c088ad11e1e63035227e873382ae7274c686adfcd1201dea3108c641d44"}
Apr 24 23:54:03.076586 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:03.076380 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xnqmd" event={"ID":"0e517da2-c437-430c-aec1-02e2d22665ca","Type":"ContainerStarted","Data":"6350b084ea1d57fd73c7229fe45dc9af61dd07c457e494bfd86adfb93c1d1774"}
Apr 24 23:54:03.077513 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:03.077486 2569 generic.go:358] "Generic (PLEG): container finished" podID="cfab2e9e-eb84-4b70-bc59-197bc3f27fb6" containerID="5f79e50aa125366b09221da30752e4d4b1abb5af4bc30ed8c8f6959b49c308cd" exitCode=0
Apr 24 23:54:03.077597 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:03.077574 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gv8rj" event={"ID":"cfab2e9e-eb84-4b70-bc59-197bc3f27fb6","Type":"ContainerDied","Data":"5f79e50aa125366b09221da30752e4d4b1abb5af4bc30ed8c8f6959b49c308cd"}
Apr 24 23:54:03.079302 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:03.079274 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dzjgx" event={"ID":"92a57ac1-cd55-47a0-a17d-cf953d38b2ea","Type":"ContainerStarted","Data":"e7361bffebc733e2e756597f65d4e9987541a70dcfc0de4e2b06ca2b4faf9fc8"}
Apr 24 23:54:03.503189 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:03.503151 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-2r7xx"
Apr 24 23:54:03.503878 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:03.503862 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-2r7xx"
Apr 24 23:54:03.972321 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:03.972229 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-npwg7"
Apr 24 23:54:03.972538 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:03.972247 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lsbf2"
Apr 24 23:54:03.972538 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:54:03.972379 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-npwg7" podUID="37e765cb-b1c9-4330-ac47-4918ba2ebf0a"
Apr 24 23:54:03.972538 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:54:03.972432 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lsbf2" podUID="44764fbc-9742-4d96-ae9e-d45956e60888"
Apr 24 23:54:04.083488 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:04.083444 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dzjgx" event={"ID":"92a57ac1-cd55-47a0-a17d-cf953d38b2ea","Type":"ContainerStarted","Data":"8b9d255f0f367558247f513ec71954b1b1c8a1f124e3e0e80efba89d201e7454"}
Apr 24 23:54:04.084917 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:04.084890 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-5s57p" event={"ID":"92d3f68f-64da-4914-9d0e-66109d7ac351","Type":"ContainerStarted","Data":"198b2ffc6c214b473f934720065794db54a93ea666fc13e8be659c22fe0c4a07"}
Apr 24 23:54:04.085197 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:04.085167 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-2r7xx"
Apr 24 23:54:04.085537 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:04.085518 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-2r7xx"
Apr 24 23:54:04.100877 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:04.100838 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dzjgx" podStartSLOduration=3.134956842 podStartE2EDuration="22.100825491s" podCreationTimestamp="2026-04-24 23:53:42 +0000 UTC" firstStartedPulling="2026-04-24 23:53:44.551916777 +0000 UTC m=+3.156643367" lastFinishedPulling="2026-04-24 23:54:03.517785437 +0000 UTC m=+22.122512016" observedRunningTime="2026-04-24 23:54:04.100369785 +0000 UTC m=+22.705096376" watchObservedRunningTime="2026-04-24 23:54:04.100825491 +0000 UTC m=+22.705552086"
Apr 24 23:54:04.126935 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:04.126891 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-5s57p" podStartSLOduration=5.066471301 podStartE2EDuration="22.126879427s" podCreationTimestamp="2026-04-24 23:53:42 +0000 UTC" firstStartedPulling="2026-04-24 23:53:44.546960777 +0000 UTC m=+3.151687355" lastFinishedPulling="2026-04-24 23:54:01.607368906 +0000 UTC m=+20.212095481" observedRunningTime="2026-04-24 23:54:04.126567672 +0000 UTC m=+22.731294271" watchObservedRunningTime="2026-04-24 23:54:04.126879427 +0000 UTC m=+22.731606073"
Apr 24 23:54:04.924086 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:04.923996 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d13d238f-ed70-4223-a161-a53a31be9a63-original-pull-secret\") pod \"global-pull-secret-syncer-pwpfb\" (UID: \"d13d238f-ed70-4223-a161-a53a31be9a63\") " pod="kube-system/global-pull-secret-syncer-pwpfb"
Apr 24 23:54:04.924590 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:54:04.924163 2569 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 23:54:04.924590 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:54:04.924242 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d13d238f-ed70-4223-a161-a53a31be9a63-original-pull-secret podName:d13d238f-ed70-4223-a161-a53a31be9a63 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:08.924221376 +0000 UTC m=+27.528947960 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d13d238f-ed70-4223-a161-a53a31be9a63-original-pull-secret") pod "global-pull-secret-syncer-pwpfb" (UID: "d13d238f-ed70-4223-a161-a53a31be9a63") : object "kube-system"/"original-pull-secret" not registered
Apr 24 23:54:04.973195 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:04.973160 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pwpfb"
Apr 24 23:54:04.973345 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:54:04.973278 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-pwpfb" podUID="d13d238f-ed70-4223-a161-a53a31be9a63"
Apr 24 23:54:05.090445 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:05.090387 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xnqmd" event={"ID":"0e517da2-c437-430c-aec1-02e2d22665ca","Type":"ContainerStarted","Data":"63d31e11eecada2a95300f6f09b76d0ba08476dd8428dfec55e31cba330d0eb0"}
Apr 24 23:54:05.973092 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:05.973056 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-npwg7"
Apr 24 23:54:05.973656 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:05.973100 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lsbf2"
Apr 24 23:54:05.973656 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:54:05.973212 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-npwg7" podUID="37e765cb-b1c9-4330-ac47-4918ba2ebf0a"
Apr 24 23:54:05.973656 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:54:05.973347 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lsbf2" podUID="44764fbc-9742-4d96-ae9e-d45956e60888"
Apr 24 23:54:06.972908 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:06.972833 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pwpfb"
Apr 24 23:54:06.973074 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:54:06.972933 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-pwpfb" podUID="d13d238f-ed70-4223-a161-a53a31be9a63"
Apr 24 23:54:07.972980 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:07.972820 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lsbf2"
Apr 24 23:54:07.973362 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:07.972820 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-npwg7"
Apr 24 23:54:07.973362 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:54:07.973055 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lsbf2" podUID="44764fbc-9742-4d96-ae9e-d45956e60888"
Apr 24 23:54:07.973362 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:54:07.973131 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-npwg7" podUID="37e765cb-b1c9-4330-ac47-4918ba2ebf0a"
Apr 24 23:54:08.098491 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:08.098458 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xnqmd" event={"ID":"0e517da2-c437-430c-aec1-02e2d22665ca","Type":"ContainerStarted","Data":"25f18c1503da35839b05ccd6525570aad33eb1e2588f76a7f1444780c3b62600"}
Apr 24 23:54:08.098735 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:08.098717 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-xnqmd"
Apr 24 23:54:08.100282 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:08.100257 2569 generic.go:358] "Generic (PLEG): container finished" podID="cfab2e9e-eb84-4b70-bc59-197bc3f27fb6" containerID="d3267c647a260232f846405870e7f1f95056029b07ed6f5ac85c6fef72f37d0c" exitCode=0
Apr 24 23:54:08.100427 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:08.100291 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gv8rj" event={"ID":"cfab2e9e-eb84-4b70-bc59-197bc3f27fb6","Type":"ContainerDied","Data":"d3267c647a260232f846405870e7f1f95056029b07ed6f5ac85c6fef72f37d0c"}
Apr 24 23:54:08.114776 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:08.114755 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-xnqmd"
Apr 24 23:54:08.127999 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:08.127959 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-xnqmd" podStartSLOduration=8.59772081 podStartE2EDuration="26.12794784s" podCreationTimestamp="2026-04-24 23:53:42 +0000 UTC" firstStartedPulling="2026-04-24 23:53:44.549797597 +0000 UTC m=+3.154524188" lastFinishedPulling="2026-04-24 23:54:02.080024642 +0000 UTC m=+20.684751218" observedRunningTime="2026-04-24 23:54:08.127451071 +0000 UTC m=+26.732177682" watchObservedRunningTime="2026-04-24 23:54:08.12794784 +0000 UTC m=+26.732674483"
Apr 24 23:54:08.955362 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:08.955166 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d13d238f-ed70-4223-a161-a53a31be9a63-original-pull-secret\") pod \"global-pull-secret-syncer-pwpfb\" (UID: \"d13d238f-ed70-4223-a161-a53a31be9a63\") " pod="kube-system/global-pull-secret-syncer-pwpfb"
Apr 24 23:54:08.955519 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:54:08.955330 2569 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 23:54:08.955582 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:54:08.955531 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d13d238f-ed70-4223-a161-a53a31be9a63-original-pull-secret podName:d13d238f-ed70-4223-a161-a53a31be9a63 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:16.955510623 +0000 UTC m=+35.560237229 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d13d238f-ed70-4223-a161-a53a31be9a63-original-pull-secret") pod "global-pull-secret-syncer-pwpfb" (UID: "d13d238f-ed70-4223-a161-a53a31be9a63") : object "kube-system"/"original-pull-secret" not registered
Apr 24 23:54:08.972480 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:08.972459 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pwpfb"
Apr 24 23:54:08.972585 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:54:08.972570 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-pwpfb" podUID="d13d238f-ed70-4223-a161-a53a31be9a63"
Apr 24 23:54:09.052708 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:09.052682 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-lsbf2"]
Apr 24 23:54:09.053101 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:09.052797 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lsbf2"
Apr 24 23:54:09.053101 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:54:09.052871 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lsbf2" podUID="44764fbc-9742-4d96-ae9e-d45956e60888"
Apr 24 23:54:09.058988 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:09.058949 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-pwpfb"]
Apr 24 23:54:09.059589 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:09.059569 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-npwg7"]
Apr 24 23:54:09.059708 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:09.059690 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-npwg7"
Apr 24 23:54:09.059818 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:54:09.059789 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-npwg7" podUID="37e765cb-b1c9-4330-ac47-4918ba2ebf0a"
Apr 24 23:54:09.104815 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:09.104737 2569 generic.go:358] "Generic (PLEG): container finished" podID="cfab2e9e-eb84-4b70-bc59-197bc3f27fb6" containerID="129fad3f4ef6dd42fa9b4e3656e9fdf62d613dab3fcae92dd5510cd8d853f180" exitCode=0
Apr 24 23:54:09.104945 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:09.104812 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gv8rj" event={"ID":"cfab2e9e-eb84-4b70-bc59-197bc3f27fb6","Type":"ContainerDied","Data":"129fad3f4ef6dd42fa9b4e3656e9fdf62d613dab3fcae92dd5510cd8d853f180"}
Apr 24 23:54:09.104945 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:09.104839 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pwpfb"
Apr 24 23:54:09.105053 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:54:09.104949 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-pwpfb" podUID="d13d238f-ed70-4223-a161-a53a31be9a63"
Apr 24 23:54:09.105114 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:09.105098 2569 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 24 23:54:09.105510 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:09.105479 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-xnqmd"
Apr 24 23:54:09.119341 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:09.119320 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-xnqmd"
Apr 24 23:54:10.108505 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:10.108426 2569 generic.go:358] "Generic (PLEG): container finished" podID="cfab2e9e-eb84-4b70-bc59-197bc3f27fb6" containerID="383f9343edcbc1091c6777296361aac051fcb67363fe0c2d8c2729dd86aea318" exitCode=0
Apr 24 23:54:10.108854 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:10.108520 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gv8rj" event={"ID":"cfab2e9e-eb84-4b70-bc59-197bc3f27fb6","Type":"ContainerDied","Data":"383f9343edcbc1091c6777296361aac051fcb67363fe0c2d8c2729dd86aea318"}
Apr 24 23:54:10.108854 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:10.108634 2569 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 24 23:54:10.972471 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:10.972431 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pwpfb"
Apr 24 23:54:10.972620 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:10.972431 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lsbf2"
Apr 24 23:54:10.972620 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:54:10.972560 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-pwpfb" podUID="d13d238f-ed70-4223-a161-a53a31be9a63"
Apr 24 23:54:10.972620 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:54:10.972605 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lsbf2" podUID="44764fbc-9742-4d96-ae9e-d45956e60888"
Apr 24 23:54:10.972620 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:10.972459 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-npwg7"
Apr 24 23:54:10.972784 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:54:10.972758 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-npwg7" podUID="37e765cb-b1c9-4330-ac47-4918ba2ebf0a"
Apr 24 23:54:11.110814 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:11.110782 2569 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 24 23:54:11.820033 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:11.819991 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-xnqmd"
Apr 24 23:54:12.128718 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:12.128623 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-xnqmd" podUID="0e517da2-c437-430c-aec1-02e2d22665ca" containerName="ovnkube-controller" probeResult="failure" output=""
Apr 24 23:54:12.972416 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:12.972366 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pwpfb"
Apr 24 23:54:12.972598 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:12.972385 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lsbf2"
Apr 24 23:54:12.972598 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:12.972424 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-npwg7"
Apr 24 23:54:12.972598 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:54:12.972500 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-pwpfb" podUID="d13d238f-ed70-4223-a161-a53a31be9a63"
Apr 24 23:54:12.972721 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:54:12.972642 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lsbf2" podUID="44764fbc-9742-4d96-ae9e-d45956e60888"
Apr 24 23:54:12.972758 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:54:12.972730 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-npwg7" podUID="37e765cb-b1c9-4330-ac47-4918ba2ebf0a"
Apr 24 23:54:14.681820 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:14.681629 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-214.ec2.internal" event="NodeReady"
Apr 24 23:54:14.682264 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:14.681949 2569 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 24 23:54:14.715135 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:14.715099 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-779c988769-4lp9b"]
Apr 24 23:54:14.739449 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:14.739420 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-ckvqn"]
Apr 24 23:54:14.739610 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:14.739582 2569 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-image-registry/image-registry-779c988769-4lp9b" Apr 24 23:54:14.742454 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:14.742147 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 24 23:54:14.742454 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:14.742160 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-68gn9\"" Apr 24 23:54:14.742454 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:14.742147 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 24 23:54:14.742681 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:14.742622 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 24 23:54:14.752043 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:14.752019 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 24 23:54:14.757993 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:14.757973 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-xvjhv"] Apr 24 23:54:14.758155 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:14.758136 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-ckvqn" Apr 24 23:54:14.760579 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:14.760418 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 24 23:54:14.760579 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:14.760436 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 24 23:54:14.760579 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:14.760477 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-xmfk4\"" Apr 24 23:54:14.772370 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:14.772347 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-779c988769-4lp9b"] Apr 24 23:54:14.772570 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:14.772378 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-ckvqn"] Apr 24 23:54:14.772570 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:14.772394 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-xvjhv"] Apr 24 23:54:14.772570 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:14.772503 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-xvjhv" Apr 24 23:54:14.775456 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:14.775433 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 24 23:54:14.775556 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:14.775446 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-mn69n\"" Apr 24 23:54:14.775556 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:14.775531 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 24 23:54:14.775671 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:14.775628 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 24 23:54:14.899122 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:14.899078 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/586d6de3-4325-4b34-af6a-576dc929fdce-config-volume\") pod \"dns-default-ckvqn\" (UID: \"586d6de3-4325-4b34-af6a-576dc929fdce\") " pod="openshift-dns/dns-default-ckvqn" Apr 24 23:54:14.899317 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:14.899129 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7-registry-certificates\") pod \"image-registry-779c988769-4lp9b\" (UID: \"b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7\") " pod="openshift-image-registry/image-registry-779c988769-4lp9b" Apr 24 23:54:14.899317 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:14.899159 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"registry-tls\" (UniqueName: \"kubernetes.io/projected/b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7-registry-tls\") pod \"image-registry-779c988769-4lp9b\" (UID: \"b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7\") " pod="openshift-image-registry/image-registry-779c988769-4lp9b" Apr 24 23:54:14.899317 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:14.899209 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7t5h\" (UniqueName: \"kubernetes.io/projected/586d6de3-4325-4b34-af6a-576dc929fdce-kube-api-access-w7t5h\") pod \"dns-default-ckvqn\" (UID: \"586d6de3-4325-4b34-af6a-576dc929fdce\") " pod="openshift-dns/dns-default-ckvqn" Apr 24 23:54:14.899317 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:14.899241 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7-ca-trust-extracted\") pod \"image-registry-779c988769-4lp9b\" (UID: \"b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7\") " pod="openshift-image-registry/image-registry-779c988769-4lp9b" Apr 24 23:54:14.899527 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:14.899311 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/586d6de3-4325-4b34-af6a-576dc929fdce-tmp-dir\") pod \"dns-default-ckvqn\" (UID: \"586d6de3-4325-4b34-af6a-576dc929fdce\") " pod="openshift-dns/dns-default-ckvqn" Apr 24 23:54:14.899527 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:14.899375 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7-image-registry-private-configuration\") pod \"image-registry-779c988769-4lp9b\" (UID: \"b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7\") " 
pod="openshift-image-registry/image-registry-779c988769-4lp9b" Apr 24 23:54:14.899527 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:14.899427 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7-bound-sa-token\") pod \"image-registry-779c988769-4lp9b\" (UID: \"b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7\") " pod="openshift-image-registry/image-registry-779c988769-4lp9b" Apr 24 23:54:14.899527 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:14.899460 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b819f9b3-e5fe-4501-9625-b73431d3105c-cert\") pod \"ingress-canary-xvjhv\" (UID: \"b819f9b3-e5fe-4501-9625-b73431d3105c\") " pod="openshift-ingress-canary/ingress-canary-xvjhv" Apr 24 23:54:14.899527 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:14.899479 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/586d6de3-4325-4b34-af6a-576dc929fdce-metrics-tls\") pod \"dns-default-ckvqn\" (UID: \"586d6de3-4325-4b34-af6a-576dc929fdce\") " pod="openshift-dns/dns-default-ckvqn" Apr 24 23:54:14.899527 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:14.899499 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7-trusted-ca\") pod \"image-registry-779c988769-4lp9b\" (UID: \"b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7\") " pod="openshift-image-registry/image-registry-779c988769-4lp9b" Apr 24 23:54:14.899775 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:14.899532 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hhjv\" (UniqueName: 
\"kubernetes.io/projected/b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7-kube-api-access-4hhjv\") pod \"image-registry-779c988769-4lp9b\" (UID: \"b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7\") " pod="openshift-image-registry/image-registry-779c988769-4lp9b" Apr 24 23:54:14.899775 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:14.899585 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9lfw\" (UniqueName: \"kubernetes.io/projected/b819f9b3-e5fe-4501-9625-b73431d3105c-kube-api-access-s9lfw\") pod \"ingress-canary-xvjhv\" (UID: \"b819f9b3-e5fe-4501-9625-b73431d3105c\") " pod="openshift-ingress-canary/ingress-canary-xvjhv" Apr 24 23:54:14.899775 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:14.899625 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7-installation-pull-secrets\") pod \"image-registry-779c988769-4lp9b\" (UID: \"b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7\") " pod="openshift-image-registry/image-registry-779c988769-4lp9b" Apr 24 23:54:14.973087 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:14.973050 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pwpfb" Apr 24 23:54:14.973087 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:14.973076 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lsbf2" Apr 24 23:54:14.973323 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:14.973050 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-npwg7" Apr 24 23:54:14.975999 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:14.975967 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 24 23:54:14.976243 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:14.976223 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 24 23:54:14.976365 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:14.976257 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 24 23:54:14.976365 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:14.976328 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-w9ssg\"" Apr 24 23:54:14.976509 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:14.976374 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-8lkgq\"" Apr 24 23:54:14.976509 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:14.976444 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 24 23:54:15.000149 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:15.000126 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/586d6de3-4325-4b34-af6a-576dc929fdce-tmp-dir\") pod \"dns-default-ckvqn\" (UID: \"586d6de3-4325-4b34-af6a-576dc929fdce\") " pod="openshift-dns/dns-default-ckvqn" Apr 24 23:54:15.000275 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:15.000185 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: 
\"kubernetes.io/secret/b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7-image-registry-private-configuration\") pod \"image-registry-779c988769-4lp9b\" (UID: \"b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7\") " pod="openshift-image-registry/image-registry-779c988769-4lp9b" Apr 24 23:54:15.000275 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:15.000216 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7-bound-sa-token\") pod \"image-registry-779c988769-4lp9b\" (UID: \"b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7\") " pod="openshift-image-registry/image-registry-779c988769-4lp9b" Apr 24 23:54:15.000275 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:15.000248 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b819f9b3-e5fe-4501-9625-b73431d3105c-cert\") pod \"ingress-canary-xvjhv\" (UID: \"b819f9b3-e5fe-4501-9625-b73431d3105c\") " pod="openshift-ingress-canary/ingress-canary-xvjhv" Apr 24 23:54:15.000453 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:15.000276 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/586d6de3-4325-4b34-af6a-576dc929fdce-metrics-tls\") pod \"dns-default-ckvqn\" (UID: \"586d6de3-4325-4b34-af6a-576dc929fdce\") " pod="openshift-dns/dns-default-ckvqn" Apr 24 23:54:15.000453 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:15.000300 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7-trusted-ca\") pod \"image-registry-779c988769-4lp9b\" (UID: \"b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7\") " pod="openshift-image-registry/image-registry-779c988769-4lp9b" Apr 24 23:54:15.000453 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:15.000328 2569 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-4hhjv\" (UniqueName: \"kubernetes.io/projected/b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7-kube-api-access-4hhjv\") pod \"image-registry-779c988769-4lp9b\" (UID: \"b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7\") " pod="openshift-image-registry/image-registry-779c988769-4lp9b" Apr 24 23:54:15.000453 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:15.000359 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s9lfw\" (UniqueName: \"kubernetes.io/projected/b819f9b3-e5fe-4501-9625-b73431d3105c-kube-api-access-s9lfw\") pod \"ingress-canary-xvjhv\" (UID: \"b819f9b3-e5fe-4501-9625-b73431d3105c\") " pod="openshift-ingress-canary/ingress-canary-xvjhv" Apr 24 23:54:15.000453 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:54:15.000365 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 23:54:15.000453 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:54:15.000374 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 23:54:15.000453 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:15.000387 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7-installation-pull-secrets\") pod \"image-registry-779c988769-4lp9b\" (UID: \"b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7\") " pod="openshift-image-registry/image-registry-779c988769-4lp9b" Apr 24 23:54:15.000453 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:54:15.000443 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b819f9b3-e5fe-4501-9625-b73431d3105c-cert podName:b819f9b3-e5fe-4501-9625-b73431d3105c nodeName:}" failed. No retries permitted until 2026-04-24 23:54:15.500420936 +0000 UTC m=+34.105147523 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b819f9b3-e5fe-4501-9625-b73431d3105c-cert") pod "ingress-canary-xvjhv" (UID: "b819f9b3-e5fe-4501-9625-b73431d3105c") : secret "canary-serving-cert" not found Apr 24 23:54:15.000906 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:15.000517 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/586d6de3-4325-4b34-af6a-576dc929fdce-tmp-dir\") pod \"dns-default-ckvqn\" (UID: \"586d6de3-4325-4b34-af6a-576dc929fdce\") " pod="openshift-dns/dns-default-ckvqn" Apr 24 23:54:15.000906 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:54:15.000567 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/586d6de3-4325-4b34-af6a-576dc929fdce-metrics-tls podName:586d6de3-4325-4b34-af6a-576dc929fdce nodeName:}" failed. No retries permitted until 2026-04-24 23:54:15.500544226 +0000 UTC m=+34.105270826 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/586d6de3-4325-4b34-af6a-576dc929fdce-metrics-tls") pod "dns-default-ckvqn" (UID: "586d6de3-4325-4b34-af6a-576dc929fdce") : secret "dns-default-metrics-tls" not found Apr 24 23:54:15.000906 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:15.000622 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/586d6de3-4325-4b34-af6a-576dc929fdce-config-volume\") pod \"dns-default-ckvqn\" (UID: \"586d6de3-4325-4b34-af6a-576dc929fdce\") " pod="openshift-dns/dns-default-ckvqn" Apr 24 23:54:15.000906 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:15.000651 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7-registry-certificates\") pod \"image-registry-779c988769-4lp9b\" (UID: \"b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7\") " pod="openshift-image-registry/image-registry-779c988769-4lp9b" Apr 24 23:54:15.000906 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:15.000776 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7-registry-tls\") pod \"image-registry-779c988769-4lp9b\" (UID: \"b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7\") " pod="openshift-image-registry/image-registry-779c988769-4lp9b" Apr 24 23:54:15.000906 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:15.000817 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w7t5h\" (UniqueName: \"kubernetes.io/projected/586d6de3-4325-4b34-af6a-576dc929fdce-kube-api-access-w7t5h\") pod \"dns-default-ckvqn\" (UID: \"586d6de3-4325-4b34-af6a-576dc929fdce\") " pod="openshift-dns/dns-default-ckvqn" Apr 24 23:54:15.000906 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:15.000840 
2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7-ca-trust-extracted\") pod \"image-registry-779c988769-4lp9b\" (UID: \"b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7\") " pod="openshift-image-registry/image-registry-779c988769-4lp9b" Apr 24 23:54:15.001233 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:54:15.000945 2569 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 23:54:15.001233 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:54:15.000961 2569 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-779c988769-4lp9b: secret "image-registry-tls" not found Apr 24 23:54:15.001233 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:54:15.001017 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7-registry-tls podName:b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:15.501000089 +0000 UTC m=+34.105726679 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7-registry-tls") pod "image-registry-779c988769-4lp9b" (UID: "b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7") : secret "image-registry-tls" not found Apr 24 23:54:15.001233 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:15.001162 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/586d6de3-4325-4b34-af6a-576dc929fdce-config-volume\") pod \"dns-default-ckvqn\" (UID: \"586d6de3-4325-4b34-af6a-576dc929fdce\") " pod="openshift-dns/dns-default-ckvqn" Apr 24 23:54:15.001233 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:15.001189 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7-registry-certificates\") pod \"image-registry-779c988769-4lp9b\" (UID: \"b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7\") " pod="openshift-image-registry/image-registry-779c988769-4lp9b" Apr 24 23:54:15.001233 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:15.001223 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7-ca-trust-extracted\") pod \"image-registry-779c988769-4lp9b\" (UID: \"b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7\") " pod="openshift-image-registry/image-registry-779c988769-4lp9b" Apr 24 23:54:15.001547 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:15.001493 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7-trusted-ca\") pod \"image-registry-779c988769-4lp9b\" (UID: \"b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7\") " pod="openshift-image-registry/image-registry-779c988769-4lp9b" Apr 24 23:54:15.004933 ip-10-0-133-214 kubenswrapper[2569]: I0424 
23:54:15.004909 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7-image-registry-private-configuration\") pod \"image-registry-779c988769-4lp9b\" (UID: \"b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7\") " pod="openshift-image-registry/image-registry-779c988769-4lp9b" Apr 24 23:54:15.005079 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:15.004955 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7-installation-pull-secrets\") pod \"image-registry-779c988769-4lp9b\" (UID: \"b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7\") " pod="openshift-image-registry/image-registry-779c988769-4lp9b" Apr 24 23:54:15.009293 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:15.009269 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hhjv\" (UniqueName: \"kubernetes.io/projected/b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7-kube-api-access-4hhjv\") pod \"image-registry-779c988769-4lp9b\" (UID: \"b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7\") " pod="openshift-image-registry/image-registry-779c988769-4lp9b" Apr 24 23:54:15.009601 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:15.009581 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7-bound-sa-token\") pod \"image-registry-779c988769-4lp9b\" (UID: \"b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7\") " pod="openshift-image-registry/image-registry-779c988769-4lp9b" Apr 24 23:54:15.009749 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:15.009631 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7t5h\" (UniqueName: \"kubernetes.io/projected/586d6de3-4325-4b34-af6a-576dc929fdce-kube-api-access-w7t5h\") pod \"dns-default-ckvqn\" 
(UID: \"586d6de3-4325-4b34-af6a-576dc929fdce\") " pod="openshift-dns/dns-default-ckvqn" Apr 24 23:54:15.009829 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:15.009808 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9lfw\" (UniqueName: \"kubernetes.io/projected/b819f9b3-e5fe-4501-9625-b73431d3105c-kube-api-access-s9lfw\") pod \"ingress-canary-xvjhv\" (UID: \"b819f9b3-e5fe-4501-9625-b73431d3105c\") " pod="openshift-ingress-canary/ingress-canary-xvjhv" Apr 24 23:54:15.505219 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:15.505184 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b819f9b3-e5fe-4501-9625-b73431d3105c-cert\") pod \"ingress-canary-xvjhv\" (UID: \"b819f9b3-e5fe-4501-9625-b73431d3105c\") " pod="openshift-ingress-canary/ingress-canary-xvjhv" Apr 24 23:54:15.505219 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:15.505222 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/586d6de3-4325-4b34-af6a-576dc929fdce-metrics-tls\") pod \"dns-default-ckvqn\" (UID: \"586d6de3-4325-4b34-af6a-576dc929fdce\") " pod="openshift-dns/dns-default-ckvqn" Apr 24 23:54:15.505492 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:15.505257 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7-registry-tls\") pod \"image-registry-779c988769-4lp9b\" (UID: \"b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7\") " pod="openshift-image-registry/image-registry-779c988769-4lp9b" Apr 24 23:54:15.505492 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:54:15.505330 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 23:54:15.505492 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:54:15.505345 2569 secret.go:189] 
Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 23:54:15.505492 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:54:15.505335 2569 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 23:54:15.505492 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:54:15.505420 2569 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-779c988769-4lp9b: secret "image-registry-tls" not found Apr 24 23:54:15.505492 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:54:15.505391 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b819f9b3-e5fe-4501-9625-b73431d3105c-cert podName:b819f9b3-e5fe-4501-9625-b73431d3105c nodeName:}" failed. No retries permitted until 2026-04-24 23:54:16.505375672 +0000 UTC m=+35.110102249 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b819f9b3-e5fe-4501-9625-b73431d3105c-cert") pod "ingress-canary-xvjhv" (UID: "b819f9b3-e5fe-4501-9625-b73431d3105c") : secret "canary-serving-cert" not found Apr 24 23:54:15.505492 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:54:15.505459 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7-registry-tls podName:b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:16.50544278 +0000 UTC m=+35.110169358 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7-registry-tls") pod "image-registry-779c988769-4lp9b" (UID: "b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7") : secret "image-registry-tls" not found Apr 24 23:54:15.505492 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:54:15.505477 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/586d6de3-4325-4b34-af6a-576dc929fdce-metrics-tls podName:586d6de3-4325-4b34-af6a-576dc929fdce nodeName:}" failed. No retries permitted until 2026-04-24 23:54:16.505467975 +0000 UTC m=+35.110194556 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/586d6de3-4325-4b34-af6a-576dc929fdce-metrics-tls") pod "dns-default-ckvqn" (UID: "586d6de3-4325-4b34-af6a-576dc929fdce") : secret "dns-default-metrics-tls" not found Apr 24 23:54:15.605583 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:15.605553 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/37e765cb-b1c9-4330-ac47-4918ba2ebf0a-metrics-certs\") pod \"network-metrics-daemon-npwg7\" (UID: \"37e765cb-b1c9-4330-ac47-4918ba2ebf0a\") " pod="openshift-multus/network-metrics-daemon-npwg7" Apr 24 23:54:15.605678 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:54:15.605665 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 24 23:54:15.605721 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:54:15.605713 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/37e765cb-b1c9-4330-ac47-4918ba2ebf0a-metrics-certs podName:37e765cb-b1c9-4330-ac47-4918ba2ebf0a nodeName:}" failed. No retries permitted until 2026-04-24 23:54:47.60570076 +0000 UTC m=+66.210427341 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/37e765cb-b1c9-4330-ac47-4918ba2ebf0a-metrics-certs") pod "network-metrics-daemon-npwg7" (UID: "37e765cb-b1c9-4330-ac47-4918ba2ebf0a") : secret "metrics-daemon-secret" not found Apr 24 23:54:15.807772 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:15.807704 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bmgwg\" (UniqueName: \"kubernetes.io/projected/44764fbc-9742-4d96-ae9e-d45956e60888-kube-api-access-bmgwg\") pod \"network-check-target-lsbf2\" (UID: \"44764fbc-9742-4d96-ae9e-d45956e60888\") " pod="openshift-network-diagnostics/network-check-target-lsbf2" Apr 24 23:54:15.810178 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:15.810160 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmgwg\" (UniqueName: \"kubernetes.io/projected/44764fbc-9742-4d96-ae9e-d45956e60888-kube-api-access-bmgwg\") pod \"network-check-target-lsbf2\" (UID: \"44764fbc-9742-4d96-ae9e-d45956e60888\") " pod="openshift-network-diagnostics/network-check-target-lsbf2" Apr 24 23:54:15.896527 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:15.896498 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lsbf2" Apr 24 23:54:16.201334 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:16.201294 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-lsbf2"] Apr 24 23:54:16.205732 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:54:16.205707 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44764fbc_9742_4d96_ae9e_d45956e60888.slice/crio-4ebd1e666e15eb4042b15e51de057068633d5e82e4005269edb38e3b67e62d71 WatchSource:0}: Error finding container 4ebd1e666e15eb4042b15e51de057068633d5e82e4005269edb38e3b67e62d71: Status 404 returned error can't find the container with id 4ebd1e666e15eb4042b15e51de057068633d5e82e4005269edb38e3b67e62d71 Apr 24 23:54:16.512524 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:16.512309 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b819f9b3-e5fe-4501-9625-b73431d3105c-cert\") pod \"ingress-canary-xvjhv\" (UID: \"b819f9b3-e5fe-4501-9625-b73431d3105c\") " pod="openshift-ingress-canary/ingress-canary-xvjhv" Apr 24 23:54:16.512524 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:16.512493 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/586d6de3-4325-4b34-af6a-576dc929fdce-metrics-tls\") pod \"dns-default-ckvqn\" (UID: \"586d6de3-4325-4b34-af6a-576dc929fdce\") " pod="openshift-dns/dns-default-ckvqn" Apr 24 23:54:16.512717 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:16.512529 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7-registry-tls\") pod \"image-registry-779c988769-4lp9b\" (UID: \"b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7\") " 
pod="openshift-image-registry/image-registry-779c988769-4lp9b" Apr 24 23:54:16.512717 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:54:16.512462 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 23:54:16.512717 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:54:16.512617 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b819f9b3-e5fe-4501-9625-b73431d3105c-cert podName:b819f9b3-e5fe-4501-9625-b73431d3105c nodeName:}" failed. No retries permitted until 2026-04-24 23:54:18.51259507 +0000 UTC m=+37.117321645 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b819f9b3-e5fe-4501-9625-b73431d3105c-cert") pod "ingress-canary-xvjhv" (UID: "b819f9b3-e5fe-4501-9625-b73431d3105c") : secret "canary-serving-cert" not found Apr 24 23:54:16.512717 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:54:16.512630 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 23:54:16.512717 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:54:16.512690 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/586d6de3-4325-4b34-af6a-576dc929fdce-metrics-tls podName:586d6de3-4325-4b34-af6a-576dc929fdce nodeName:}" failed. No retries permitted until 2026-04-24 23:54:18.51267744 +0000 UTC m=+37.117404016 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/586d6de3-4325-4b34-af6a-576dc929fdce-metrics-tls") pod "dns-default-ckvqn" (UID: "586d6de3-4325-4b34-af6a-576dc929fdce") : secret "dns-default-metrics-tls" not found Apr 24 23:54:16.512717 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:54:16.512634 2569 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 23:54:16.512717 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:54:16.512708 2569 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-779c988769-4lp9b: secret "image-registry-tls" not found Apr 24 23:54:16.513024 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:54:16.512736 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7-registry-tls podName:b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:18.512729618 +0000 UTC m=+37.117456194 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7-registry-tls") pod "image-registry-779c988769-4lp9b" (UID: "b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7") : secret "image-registry-tls" not found Apr 24 23:54:17.020015 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:17.019974 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d13d238f-ed70-4223-a161-a53a31be9a63-original-pull-secret\") pod \"global-pull-secret-syncer-pwpfb\" (UID: \"d13d238f-ed70-4223-a161-a53a31be9a63\") " pod="kube-system/global-pull-secret-syncer-pwpfb" Apr 24 23:54:17.023335 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:17.023310 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d13d238f-ed70-4223-a161-a53a31be9a63-original-pull-secret\") pod \"global-pull-secret-syncer-pwpfb\" (UID: \"d13d238f-ed70-4223-a161-a53a31be9a63\") " pod="kube-system/global-pull-secret-syncer-pwpfb" Apr 24 23:54:17.084342 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:17.084315 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-pwpfb" Apr 24 23:54:17.123586 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:17.123551 2569 generic.go:358] "Generic (PLEG): container finished" podID="cfab2e9e-eb84-4b70-bc59-197bc3f27fb6" containerID="c8630694b0c02e5ac082cb5ac2e9b52dfb331322ba88560ed2a0af2679b99e2c" exitCode=0 Apr 24 23:54:17.123733 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:17.123625 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gv8rj" event={"ID":"cfab2e9e-eb84-4b70-bc59-197bc3f27fb6","Type":"ContainerDied","Data":"c8630694b0c02e5ac082cb5ac2e9b52dfb331322ba88560ed2a0af2679b99e2c"} Apr 24 23:54:17.124785 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:17.124762 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-lsbf2" event={"ID":"44764fbc-9742-4d96-ae9e-d45956e60888","Type":"ContainerStarted","Data":"4ebd1e666e15eb4042b15e51de057068633d5e82e4005269edb38e3b67e62d71"} Apr 24 23:54:17.199122 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:17.199089 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-pwpfb"] Apr 24 23:54:17.202908 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:54:17.202872 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd13d238f_ed70_4223_a161_a53a31be9a63.slice/crio-63051bba9aa7a1c0510f5124280d346763d5724958adec85f81c750fa9b79d1c WatchSource:0}: Error finding container 63051bba9aa7a1c0510f5124280d346763d5724958adec85f81c750fa9b79d1c: Status 404 returned error can't find the container with id 63051bba9aa7a1c0510f5124280d346763d5724958adec85f81c750fa9b79d1c Apr 24 23:54:18.129418 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:18.129365 2569 generic.go:358] "Generic (PLEG): container finished" podID="cfab2e9e-eb84-4b70-bc59-197bc3f27fb6" 
containerID="5f28050ca0e61fe1aed9e006bee088d0a0ba15d975b78850968df728b58a9fb3" exitCode=0 Apr 24 23:54:18.130199 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:18.129441 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gv8rj" event={"ID":"cfab2e9e-eb84-4b70-bc59-197bc3f27fb6","Type":"ContainerDied","Data":"5f28050ca0e61fe1aed9e006bee088d0a0ba15d975b78850968df728b58a9fb3"} Apr 24 23:54:18.131575 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:18.131537 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-pwpfb" event={"ID":"d13d238f-ed70-4223-a161-a53a31be9a63","Type":"ContainerStarted","Data":"63051bba9aa7a1c0510f5124280d346763d5724958adec85f81c750fa9b79d1c"} Apr 24 23:54:18.531007 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:18.530932 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b819f9b3-e5fe-4501-9625-b73431d3105c-cert\") pod \"ingress-canary-xvjhv\" (UID: \"b819f9b3-e5fe-4501-9625-b73431d3105c\") " pod="openshift-ingress-canary/ingress-canary-xvjhv" Apr 24 23:54:18.531007 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:18.530970 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/586d6de3-4325-4b34-af6a-576dc929fdce-metrics-tls\") pod \"dns-default-ckvqn\" (UID: \"586d6de3-4325-4b34-af6a-576dc929fdce\") " pod="openshift-dns/dns-default-ckvqn" Apr 24 23:54:18.531256 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:18.531016 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7-registry-tls\") pod \"image-registry-779c988769-4lp9b\" (UID: \"b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7\") " pod="openshift-image-registry/image-registry-779c988769-4lp9b" Apr 24 23:54:18.531256 
ip-10-0-133-214 kubenswrapper[2569]: E0424 23:54:18.531143 2569 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 23:54:18.531256 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:54:18.531156 2569 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-779c988769-4lp9b: secret "image-registry-tls" not found Apr 24 23:54:18.531256 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:54:18.531221 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7-registry-tls podName:b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:22.531192277 +0000 UTC m=+41.135918873 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7-registry-tls") pod "image-registry-779c988769-4lp9b" (UID: "b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7") : secret "image-registry-tls" not found Apr 24 23:54:18.531256 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:54:18.531143 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 23:54:18.531553 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:54:18.531290 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b819f9b3-e5fe-4501-9625-b73431d3105c-cert podName:b819f9b3-e5fe-4501-9625-b73431d3105c nodeName:}" failed. No retries permitted until 2026-04-24 23:54:22.531271494 +0000 UTC m=+41.135998085 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b819f9b3-e5fe-4501-9625-b73431d3105c-cert") pod "ingress-canary-xvjhv" (UID: "b819f9b3-e5fe-4501-9625-b73431d3105c") : secret "canary-serving-cert" not found Apr 24 23:54:18.531553 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:54:18.531143 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 23:54:18.531553 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:54:18.531330 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/586d6de3-4325-4b34-af6a-576dc929fdce-metrics-tls podName:586d6de3-4325-4b34-af6a-576dc929fdce nodeName:}" failed. No retries permitted until 2026-04-24 23:54:22.531320663 +0000 UTC m=+41.136047238 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/586d6de3-4325-4b34-af6a-576dc929fdce-metrics-tls") pod "dns-default-ckvqn" (UID: "586d6de3-4325-4b34-af6a-576dc929fdce") : secret "dns-default-metrics-tls" not found Apr 24 23:54:19.140199 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:19.140162 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gv8rj" event={"ID":"cfab2e9e-eb84-4b70-bc59-197bc3f27fb6","Type":"ContainerStarted","Data":"112fa5c85717914ba9c03d4ddd90e42e256878301078a194da1865d1c4902373"} Apr 24 23:54:19.162272 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:19.162191 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-gv8rj" podStartSLOduration=5.713677835 podStartE2EDuration="37.16217027s" podCreationTimestamp="2026-04-24 23:53:42 +0000 UTC" firstStartedPulling="2026-04-24 23:53:44.549060545 +0000 UTC m=+3.153787122" lastFinishedPulling="2026-04-24 23:54:15.997552981 +0000 UTC m=+34.602279557" observedRunningTime="2026-04-24 23:54:19.161035256 +0000 UTC 
m=+37.765761855" watchObservedRunningTime="2026-04-24 23:54:19.16217027 +0000 UTC m=+37.766896871" Apr 24 23:54:22.148928 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:22.148711 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-lsbf2" event={"ID":"44764fbc-9742-4d96-ae9e-d45956e60888","Type":"ContainerStarted","Data":"b6df88cb52aedac2d0ac0ec9e45b632331cbec65ba721231c3cf9f52e2917983"} Apr 24 23:54:22.149382 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:22.148938 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-lsbf2" Apr 24 23:54:22.149948 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:22.149925 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-pwpfb" event={"ID":"d13d238f-ed70-4223-a161-a53a31be9a63","Type":"ContainerStarted","Data":"ee0d4a4fa9c29a70c1005868e2071ac153a06a2f8305d3924e2890af705da8aa"} Apr 24 23:54:22.165317 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:22.165274 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-lsbf2" podStartSLOduration=35.109990563 podStartE2EDuration="40.165262521s" podCreationTimestamp="2026-04-24 23:53:42 +0000 UTC" firstStartedPulling="2026-04-24 23:54:16.207783854 +0000 UTC m=+34.812510429" lastFinishedPulling="2026-04-24 23:54:21.263055797 +0000 UTC m=+39.867782387" observedRunningTime="2026-04-24 23:54:22.164704771 +0000 UTC m=+40.769431369" watchObservedRunningTime="2026-04-24 23:54:22.165262521 +0000 UTC m=+40.769989160" Apr 24 23:54:22.185742 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:22.185702 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-pwpfb" podStartSLOduration=16.76399642 podStartE2EDuration="21.185689678s" podCreationTimestamp="2026-04-24 23:54:01 +0000 UTC" 
firstStartedPulling="2026-04-24 23:54:17.205092463 +0000 UTC m=+35.809819044" lastFinishedPulling="2026-04-24 23:54:21.626785712 +0000 UTC m=+40.231512302" observedRunningTime="2026-04-24 23:54:22.18525129 +0000 UTC m=+40.789977887" watchObservedRunningTime="2026-04-24 23:54:22.185689678 +0000 UTC m=+40.790416287" Apr 24 23:54:22.319397 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:22.319366 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-cwh6l"] Apr 24 23:54:22.347945 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:22.347920 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-cwh6l"] Apr 24 23:54:22.348077 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:22.348031 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-cwh6l" Apr 24 23:54:22.351109 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:22.351084 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 24 23:54:22.351243 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:22.351098 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 24 23:54:22.351454 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:22.351436 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-psfkt\"" Apr 24 23:54:22.460714 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:22.460641 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kz825\" (UniqueName: \"kubernetes.io/projected/6b80eb80-c7b4-43b8-b29c-8db00b583a49-kube-api-access-kz825\") pod 
\"migrator-74bb7799d9-cwh6l\" (UID: \"6b80eb80-c7b4-43b8-b29c-8db00b583a49\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-cwh6l" Apr 24 23:54:22.561129 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:22.561096 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kz825\" (UniqueName: \"kubernetes.io/projected/6b80eb80-c7b4-43b8-b29c-8db00b583a49-kube-api-access-kz825\") pod \"migrator-74bb7799d9-cwh6l\" (UID: \"6b80eb80-c7b4-43b8-b29c-8db00b583a49\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-cwh6l" Apr 24 23:54:22.561296 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:22.561138 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b819f9b3-e5fe-4501-9625-b73431d3105c-cert\") pod \"ingress-canary-xvjhv\" (UID: \"b819f9b3-e5fe-4501-9625-b73431d3105c\") " pod="openshift-ingress-canary/ingress-canary-xvjhv" Apr 24 23:54:22.561296 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:22.561157 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/586d6de3-4325-4b34-af6a-576dc929fdce-metrics-tls\") pod \"dns-default-ckvqn\" (UID: \"586d6de3-4325-4b34-af6a-576dc929fdce\") " pod="openshift-dns/dns-default-ckvqn" Apr 24 23:54:22.561296 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:22.561187 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7-registry-tls\") pod \"image-registry-779c988769-4lp9b\" (UID: \"b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7\") " pod="openshift-image-registry/image-registry-779c988769-4lp9b" Apr 24 23:54:22.561296 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:54:22.561266 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 
24 23:54:22.561488 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:54:22.561346 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b819f9b3-e5fe-4501-9625-b73431d3105c-cert podName:b819f9b3-e5fe-4501-9625-b73431d3105c nodeName:}" failed. No retries permitted until 2026-04-24 23:54:30.5613278 +0000 UTC m=+49.166054377 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b819f9b3-e5fe-4501-9625-b73431d3105c-cert") pod "ingress-canary-xvjhv" (UID: "b819f9b3-e5fe-4501-9625-b73431d3105c") : secret "canary-serving-cert" not found Apr 24 23:54:22.561488 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:54:22.561273 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 23:54:22.561488 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:54:22.561428 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/586d6de3-4325-4b34-af6a-576dc929fdce-metrics-tls podName:586d6de3-4325-4b34-af6a-576dc929fdce nodeName:}" failed. No retries permitted until 2026-04-24 23:54:30.561416166 +0000 UTC m=+49.166142756 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/586d6de3-4325-4b34-af6a-576dc929fdce-metrics-tls") pod "dns-default-ckvqn" (UID: "586d6de3-4325-4b34-af6a-576dc929fdce") : secret "dns-default-metrics-tls" not found Apr 24 23:54:22.561488 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:54:22.561273 2569 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 23:54:22.561488 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:54:22.561448 2569 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-779c988769-4lp9b: secret "image-registry-tls" not found Apr 24 23:54:22.561488 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:54:22.561482 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7-registry-tls podName:b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:30.561471521 +0000 UTC m=+49.166198098 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7-registry-tls") pod "image-registry-779c988769-4lp9b" (UID: "b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7") : secret "image-registry-tls" not found Apr 24 23:54:22.575839 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:22.575813 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kz825\" (UniqueName: \"kubernetes.io/projected/6b80eb80-c7b4-43b8-b29c-8db00b583a49-kube-api-access-kz825\") pod \"migrator-74bb7799d9-cwh6l\" (UID: \"6b80eb80-c7b4-43b8-b29c-8db00b583a49\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-cwh6l" Apr 24 23:54:22.656269 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:22.656236 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-cwh6l" Apr 24 23:54:22.775638 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:22.775608 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-cwh6l"] Apr 24 23:54:22.778219 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:54:22.778192 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b80eb80_c7b4_43b8_b29c_8db00b583a49.slice/crio-f899c6b898ea0fee25d2f7e11c0e5e0284a1be7d19849387790f2ce5c81813b5 WatchSource:0}: Error finding container f899c6b898ea0fee25d2f7e11c0e5e0284a1be7d19849387790f2ce5c81813b5: Status 404 returned error can't find the container with id f899c6b898ea0fee25d2f7e11c0e5e0284a1be7d19849387790f2ce5c81813b5 Apr 24 23:54:23.153182 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:23.153140 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-cwh6l" event={"ID":"6b80eb80-c7b4-43b8-b29c-8db00b583a49","Type":"ContainerStarted","Data":"f899c6b898ea0fee25d2f7e11c0e5e0284a1be7d19849387790f2ce5c81813b5"} Apr 24 23:54:23.583335 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:23.583304 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-pfmxs_191956df-226e-4581-bd3c-7661b41d8536/dns-node-resolver/0.log" Apr 24 23:54:24.784566 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:24.784507 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-rsd5n_dd9e66f5-07c5-45ce-92ad-16649e745d96/node-ca/0.log" Apr 24 23:54:24.832042 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:24.832014 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-kzzjg"] Apr 24 23:54:24.835285 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:24.835270 2569 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-kzzjg" Apr 24 23:54:24.837607 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:24.837579 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 24 23:54:24.838621 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:24.838469 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 24 23:54:24.838621 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:24.838514 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 24 23:54:24.838621 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:24.838444 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-zp4tf\"" Apr 24 23:54:24.838812 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:24.838668 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 24 23:54:24.843773 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:24.843744 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-kzzjg"] Apr 24 23:54:24.977663 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:24.977629 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/29f1d8c5-c0a1-4080-9ad7-48c60bb33da0-signing-cabundle\") pod \"service-ca-865cb79987-kzzjg\" (UID: \"29f1d8c5-c0a1-4080-9ad7-48c60bb33da0\") " pod="openshift-service-ca/service-ca-865cb79987-kzzjg" Apr 24 23:54:24.977806 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:24.977690 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" 
(UniqueName: \"kubernetes.io/secret/29f1d8c5-c0a1-4080-9ad7-48c60bb33da0-signing-key\") pod \"service-ca-865cb79987-kzzjg\" (UID: \"29f1d8c5-c0a1-4080-9ad7-48c60bb33da0\") " pod="openshift-service-ca/service-ca-865cb79987-kzzjg"
Apr 24 23:54:24.977847 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:24.977795 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9vhx\" (UniqueName: \"kubernetes.io/projected/29f1d8c5-c0a1-4080-9ad7-48c60bb33da0-kube-api-access-s9vhx\") pod \"service-ca-865cb79987-kzzjg\" (UID: \"29f1d8c5-c0a1-4080-9ad7-48c60bb33da0\") " pod="openshift-service-ca/service-ca-865cb79987-kzzjg"
Apr 24 23:54:25.079162 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:25.079083 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/29f1d8c5-c0a1-4080-9ad7-48c60bb33da0-signing-key\") pod \"service-ca-865cb79987-kzzjg\" (UID: \"29f1d8c5-c0a1-4080-9ad7-48c60bb33da0\") " pod="openshift-service-ca/service-ca-865cb79987-kzzjg"
Apr 24 23:54:25.079315 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:25.079175 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s9vhx\" (UniqueName: \"kubernetes.io/projected/29f1d8c5-c0a1-4080-9ad7-48c60bb33da0-kube-api-access-s9vhx\") pod \"service-ca-865cb79987-kzzjg\" (UID: \"29f1d8c5-c0a1-4080-9ad7-48c60bb33da0\") " pod="openshift-service-ca/service-ca-865cb79987-kzzjg"
Apr 24 23:54:25.079315 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:25.079225 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/29f1d8c5-c0a1-4080-9ad7-48c60bb33da0-signing-cabundle\") pod \"service-ca-865cb79987-kzzjg\" (UID: \"29f1d8c5-c0a1-4080-9ad7-48c60bb33da0\") " pod="openshift-service-ca/service-ca-865cb79987-kzzjg"
Apr 24 23:54:25.079896 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:25.079875 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/29f1d8c5-c0a1-4080-9ad7-48c60bb33da0-signing-cabundle\") pod \"service-ca-865cb79987-kzzjg\" (UID: \"29f1d8c5-c0a1-4080-9ad7-48c60bb33da0\") " pod="openshift-service-ca/service-ca-865cb79987-kzzjg"
Apr 24 23:54:25.081583 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:25.081564 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/29f1d8c5-c0a1-4080-9ad7-48c60bb33da0-signing-key\") pod \"service-ca-865cb79987-kzzjg\" (UID: \"29f1d8c5-c0a1-4080-9ad7-48c60bb33da0\") " pod="openshift-service-ca/service-ca-865cb79987-kzzjg"
Apr 24 23:54:25.089671 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:25.089653 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9vhx\" (UniqueName: \"kubernetes.io/projected/29f1d8c5-c0a1-4080-9ad7-48c60bb33da0-kube-api-access-s9vhx\") pod \"service-ca-865cb79987-kzzjg\" (UID: \"29f1d8c5-c0a1-4080-9ad7-48c60bb33da0\") " pod="openshift-service-ca/service-ca-865cb79987-kzzjg"
Apr 24 23:54:25.145115 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:25.145089 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-kzzjg"
Apr 24 23:54:25.158612 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:25.158586 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-cwh6l" event={"ID":"6b80eb80-c7b4-43b8-b29c-8db00b583a49","Type":"ContainerStarted","Data":"d6a0773d47b30451f10a767b374bba6398378910837903e895a34a3e7244375a"}
Apr 24 23:54:25.158612 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:25.158617 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-cwh6l" event={"ID":"6b80eb80-c7b4-43b8-b29c-8db00b583a49","Type":"ContainerStarted","Data":"46ba71ad509b79c635827a1fd3ac47c554749b57f8dbf1690eb0a1783b515156"}
Apr 24 23:54:25.174826 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:25.174782 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-cwh6l" podStartSLOduration=1.431344745 podStartE2EDuration="3.174765915s" podCreationTimestamp="2026-04-24 23:54:22 +0000 UTC" firstStartedPulling="2026-04-24 23:54:22.780127856 +0000 UTC m=+41.384854432" lastFinishedPulling="2026-04-24 23:54:24.523549014 +0000 UTC m=+43.128275602" observedRunningTime="2026-04-24 23:54:25.173706931 +0000 UTC m=+43.778433529" watchObservedRunningTime="2026-04-24 23:54:25.174765915 +0000 UTC m=+43.779492513"
Apr 24 23:54:25.254221 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:25.254193 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-kzzjg"]
Apr 24 23:54:25.257081 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:54:25.257049 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29f1d8c5_c0a1_4080_9ad7_48c60bb33da0.slice/crio-54514c56ff8f2913b2eedc06a16f24c5e80e509df8ddd74c654ee1a2cf867af0 WatchSource:0}: Error finding container 54514c56ff8f2913b2eedc06a16f24c5e80e509df8ddd74c654ee1a2cf867af0: Status 404 returned error can't find the container with id 54514c56ff8f2913b2eedc06a16f24c5e80e509df8ddd74c654ee1a2cf867af0
Apr 24 23:54:26.161629 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:26.161589 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-kzzjg" event={"ID":"29f1d8c5-c0a1-4080-9ad7-48c60bb33da0","Type":"ContainerStarted","Data":"54514c56ff8f2913b2eedc06a16f24c5e80e509df8ddd74c654ee1a2cf867af0"}
Apr 24 23:54:28.167436 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:28.167388 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-kzzjg" event={"ID":"29f1d8c5-c0a1-4080-9ad7-48c60bb33da0","Type":"ContainerStarted","Data":"ee66ce64857f5c3db3f1a493dbed85cd49609a40be31c133cdabbe946e812beb"}
Apr 24 23:54:30.620085 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:30.620046 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b819f9b3-e5fe-4501-9625-b73431d3105c-cert\") pod \"ingress-canary-xvjhv\" (UID: \"b819f9b3-e5fe-4501-9625-b73431d3105c\") " pod="openshift-ingress-canary/ingress-canary-xvjhv"
Apr 24 23:54:30.620085 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:30.620087 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/586d6de3-4325-4b34-af6a-576dc929fdce-metrics-tls\") pod \"dns-default-ckvqn\" (UID: \"586d6de3-4325-4b34-af6a-576dc929fdce\") " pod="openshift-dns/dns-default-ckvqn"
Apr 24 23:54:30.620530 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:30.620159 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7-registry-tls\") pod \"image-registry-779c988769-4lp9b\" (UID: \"b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7\") " pod="openshift-image-registry/image-registry-779c988769-4lp9b"
Apr 24 23:54:30.620530 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:54:30.620185 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 23:54:30.620530 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:54:30.620237 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/586d6de3-4325-4b34-af6a-576dc929fdce-metrics-tls podName:586d6de3-4325-4b34-af6a-576dc929fdce nodeName:}" failed. No retries permitted until 2026-04-24 23:54:46.620224477 +0000 UTC m=+65.224951053 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/586d6de3-4325-4b34-af6a-576dc929fdce-metrics-tls") pod "dns-default-ckvqn" (UID: "586d6de3-4325-4b34-af6a-576dc929fdce") : secret "dns-default-metrics-tls" not found
Apr 24 23:54:30.620530 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:54:30.620263 2569 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 23:54:30.620530 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:54:30.620277 2569 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-779c988769-4lp9b: secret "image-registry-tls" not found
Apr 24 23:54:30.620530 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:54:30.620316 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7-registry-tls podName:b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:46.620305028 +0000 UTC m=+65.225031605 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7-registry-tls") pod "image-registry-779c988769-4lp9b" (UID: "b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7") : secret "image-registry-tls" not found
Apr 24 23:54:30.620530 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:54:30.620317 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 23:54:30.620530 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:54:30.620360 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b819f9b3-e5fe-4501-9625-b73431d3105c-cert podName:b819f9b3-e5fe-4501-9625-b73431d3105c nodeName:}" failed. No retries permitted until 2026-04-24 23:54:46.620344778 +0000 UTC m=+65.225071374 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b819f9b3-e5fe-4501-9625-b73431d3105c-cert") pod "ingress-canary-xvjhv" (UID: "b819f9b3-e5fe-4501-9625-b73431d3105c") : secret "canary-serving-cert" not found
Apr 24 23:54:42.122245 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:42.122217 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-xnqmd"
Apr 24 23:54:42.147783 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:42.147729 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-kzzjg" podStartSLOduration=16.080102765 podStartE2EDuration="18.147714465s" podCreationTimestamp="2026-04-24 23:54:24 +0000 UTC" firstStartedPulling="2026-04-24 23:54:25.258805627 +0000 UTC m=+43.863532202" lastFinishedPulling="2026-04-24 23:54:27.326417326 +0000 UTC m=+45.931143902" observedRunningTime="2026-04-24 23:54:28.18176263 +0000 UTC m=+46.786489230" watchObservedRunningTime="2026-04-24 23:54:42.147714465 +0000 UTC m=+60.752441065"
Apr 24 23:54:46.639086 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:46.639045 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7-registry-tls\") pod \"image-registry-779c988769-4lp9b\" (UID: \"b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7\") " pod="openshift-image-registry/image-registry-779c988769-4lp9b"
Apr 24 23:54:46.639493 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:46.639111 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b819f9b3-e5fe-4501-9625-b73431d3105c-cert\") pod \"ingress-canary-xvjhv\" (UID: \"b819f9b3-e5fe-4501-9625-b73431d3105c\") " pod="openshift-ingress-canary/ingress-canary-xvjhv"
Apr 24 23:54:46.639493 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:46.639139 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/586d6de3-4325-4b34-af6a-576dc929fdce-metrics-tls\") pod \"dns-default-ckvqn\" (UID: \"586d6de3-4325-4b34-af6a-576dc929fdce\") " pod="openshift-dns/dns-default-ckvqn"
Apr 24 23:54:46.641419 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:46.641384 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/586d6de3-4325-4b34-af6a-576dc929fdce-metrics-tls\") pod \"dns-default-ckvqn\" (UID: \"586d6de3-4325-4b34-af6a-576dc929fdce\") " pod="openshift-dns/dns-default-ckvqn"
Apr 24 23:54:46.641537 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:46.641513 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b819f9b3-e5fe-4501-9625-b73431d3105c-cert\") pod \"ingress-canary-xvjhv\" (UID: \"b819f9b3-e5fe-4501-9625-b73431d3105c\") " pod="openshift-ingress-canary/ingress-canary-xvjhv"
Apr 24 23:54:46.641597 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:46.641581 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7-registry-tls\") pod \"image-registry-779c988769-4lp9b\" (UID: \"b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7\") " pod="openshift-image-registry/image-registry-779c988769-4lp9b"
Apr 24 23:54:46.856554 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:46.856525 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-68gn9\""
Apr 24 23:54:46.864545 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:46.864527 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-779c988769-4lp9b"
Apr 24 23:54:46.870822 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:46.870799 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-xmfk4\""
Apr 24 23:54:46.879258 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:46.879234 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-ckvqn"
Apr 24 23:54:46.884062 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:46.884041 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-mn69n\""
Apr 24 23:54:46.893034 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:46.892979 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-xvjhv"
Apr 24 23:54:47.006954 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:47.006897 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-779c988769-4lp9b"]
Apr 24 23:54:47.010626 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:54:47.010593 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2ca4df1_8494_46e1_b4e0_c81aba8ca1a7.slice/crio-852658d691f413501fca9051d2bca665fe1eecf57ca5f19c62b95517e21ae0af WatchSource:0}: Error finding container 852658d691f413501fca9051d2bca665fe1eecf57ca5f19c62b95517e21ae0af: Status 404 returned error can't find the container with id 852658d691f413501fca9051d2bca665fe1eecf57ca5f19c62b95517e21ae0af
Apr 24 23:54:47.020754 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:47.020614 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-ckvqn"]
Apr 24 23:54:47.023743 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:54:47.023717 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod586d6de3_4325_4b34_af6a_576dc929fdce.slice/crio-ec24a7496a7fe916e9fd9b5505568019e91d4ae17d6b5c41ef3055ee39a598f2 WatchSource:0}: Error finding container ec24a7496a7fe916e9fd9b5505568019e91d4ae17d6b5c41ef3055ee39a598f2: Status 404 returned error can't find the container with id ec24a7496a7fe916e9fd9b5505568019e91d4ae17d6b5c41ef3055ee39a598f2
Apr 24 23:54:47.033716 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:47.033691 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-xvjhv"]
Apr 24 23:54:47.036265 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:54:47.036244 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb819f9b3_e5fe_4501_9625_b73431d3105c.slice/crio-b44d64c9313f6d61e3158814f23e0c097f6668776971ba546e2ba995ad4c55c8 WatchSource:0}: Error finding container b44d64c9313f6d61e3158814f23e0c097f6668776971ba546e2ba995ad4c55c8: Status 404 returned error can't find the container with id b44d64c9313f6d61e3158814f23e0c097f6668776971ba546e2ba995ad4c55c8
Apr 24 23:54:47.206964 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:47.206879 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-xvjhv" event={"ID":"b819f9b3-e5fe-4501-9625-b73431d3105c","Type":"ContainerStarted","Data":"b44d64c9313f6d61e3158814f23e0c097f6668776971ba546e2ba995ad4c55c8"}
Apr 24 23:54:47.208257 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:47.208227 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-779c988769-4lp9b" event={"ID":"b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7","Type":"ContainerStarted","Data":"6af9c236d96d16a2f221508ea445ca783f96e8038f787d80b357da562c944599"}
Apr 24 23:54:47.208396 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:47.208263 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-779c988769-4lp9b" event={"ID":"b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7","Type":"ContainerStarted","Data":"852658d691f413501fca9051d2bca665fe1eecf57ca5f19c62b95517e21ae0af"}
Apr 24 23:54:47.208396 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:47.208363 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-779c988769-4lp9b"
Apr 24 23:54:47.209231 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:47.209206 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-ckvqn" event={"ID":"586d6de3-4325-4b34-af6a-576dc929fdce","Type":"ContainerStarted","Data":"ec24a7496a7fe916e9fd9b5505568019e91d4ae17d6b5c41ef3055ee39a598f2"}
Apr 24 23:54:47.232122 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:47.232068 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-779c988769-4lp9b" podStartSLOduration=65.232051036 podStartE2EDuration="1m5.232051036s" podCreationTimestamp="2026-04-24 23:53:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 23:54:47.230498069 +0000 UTC m=+65.835224671" watchObservedRunningTime="2026-04-24 23:54:47.232051036 +0000 UTC m=+65.836777634"
Apr 24 23:54:47.650149 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:47.650108 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/37e765cb-b1c9-4330-ac47-4918ba2ebf0a-metrics-certs\") pod \"network-metrics-daemon-npwg7\" (UID: \"37e765cb-b1c9-4330-ac47-4918ba2ebf0a\") " pod="openshift-multus/network-metrics-daemon-npwg7"
Apr 24 23:54:47.654125 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:47.654096 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/37e765cb-b1c9-4330-ac47-4918ba2ebf0a-metrics-certs\") pod \"network-metrics-daemon-npwg7\" (UID: \"37e765cb-b1c9-4330-ac47-4918ba2ebf0a\") " pod="openshift-multus/network-metrics-daemon-npwg7"
Apr 24 23:54:47.692691 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:47.692660 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-w9ssg\""
Apr 24 23:54:47.700804 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:47.700777 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-npwg7"
Apr 24 23:54:47.853546 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:47.853516 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-npwg7"]
Apr 24 23:54:47.859500 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:54:47.858668 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37e765cb_b1c9_4330_ac47_4918ba2ebf0a.slice/crio-0b3a157b6c627d068f30c5ab48487b5ab1ed87d3b8b0eaf85db1b8cf7435235a WatchSource:0}: Error finding container 0b3a157b6c627d068f30c5ab48487b5ab1ed87d3b8b0eaf85db1b8cf7435235a: Status 404 returned error can't find the container with id 0b3a157b6c627d068f30c5ab48487b5ab1ed87d3b8b0eaf85db1b8cf7435235a
Apr 24 23:54:48.213302 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:48.213254 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-npwg7" event={"ID":"37e765cb-b1c9-4330-ac47-4918ba2ebf0a","Type":"ContainerStarted","Data":"0b3a157b6c627d068f30c5ab48487b5ab1ed87d3b8b0eaf85db1b8cf7435235a"}
Apr 24 23:54:49.916607 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:49.916572 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-58cbcbff7-ft9x9"]
Apr 24 23:54:49.930323 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:49.930297 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-57dcbdb694-vtnjj"]
Apr 24 23:54:49.930523 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:49.930500 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-58cbcbff7-ft9x9"
Apr 24 23:54:49.936185 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:49.936163 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 24 23:54:49.936472 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:49.936455 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-lv7qj\""
Apr 24 23:54:49.936746 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:49.936614 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 24 23:54:49.936746 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:49.936632 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\""
Apr 24 23:54:49.936746 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:49.936632 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 24 23:54:49.940549 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:49.940527 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-58cbcbff7-ft9x9"]
Apr 24 23:54:49.940671 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:49.940657 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-57dcbdb694-vtnjj"
Apr 24 23:54:49.947928 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:49.947907 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-57dcbdb694-vtnjj"]
Apr 24 23:54:49.950302 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:49.950282 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\""
Apr 24 23:54:50.025734 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.025711 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5695878b76-x468z"]
Apr 24 23:54:50.033204 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.033090 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5695878b76-x468z"
Apr 24 23:54:50.036905 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.036850 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 24 23:54:50.037490 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.037468 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 24 23:54:50.037685 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.037660 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 24 23:54:50.037946 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.037929 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 24 23:54:50.038181 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.038161 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 24 23:54:50.038822 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.038753 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 24 23:54:50.039816 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.039770 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 24 23:54:50.039919 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.039815 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-w2hp6\""
Apr 24 23:54:50.040263 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.040204 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-ddrd5"]
Apr 24 23:54:50.054847 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.054820 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5695878b76-x468z"]
Apr 24 23:54:50.054955 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.054946 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-ddrd5"
Apr 24 23:54:50.058361 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.057688 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-hp46j\""
Apr 24 23:54:50.061596 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.061574 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-779c988769-4lp9b"]
Apr 24 23:54:50.063329 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.063308 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-ddrd5"]
Apr 24 23:54:50.075674 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.069898 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tscvs\" (UniqueName: \"kubernetes.io/projected/e969b2dc-2e72-4829-b6e2-ba78de037eaf-kube-api-access-tscvs\") pod \"managed-serviceaccount-addon-agent-58cbcbff7-ft9x9\" (UID: \"e969b2dc-2e72-4829-b6e2-ba78de037eaf\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-58cbcbff7-ft9x9"
Apr 24 23:54:50.075674 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.069938 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/e969b2dc-2e72-4829-b6e2-ba78de037eaf-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-58cbcbff7-ft9x9\" (UID: \"e969b2dc-2e72-4829-b6e2-ba78de037eaf\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-58cbcbff7-ft9x9"
Apr 24 23:54:50.075674 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.069994 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/84fefcad-12b8-4f83-93a4-280b95fa779c-tmp\") pod \"klusterlet-addon-workmgr-57dcbdb694-vtnjj\" (UID: \"84fefcad-12b8-4f83-93a4-280b95fa779c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-57dcbdb694-vtnjj"
Apr 24 23:54:50.075674 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.070018 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/84fefcad-12b8-4f83-93a4-280b95fa779c-klusterlet-config\") pod \"klusterlet-addon-workmgr-57dcbdb694-vtnjj\" (UID: \"84fefcad-12b8-4f83-93a4-280b95fa779c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-57dcbdb694-vtnjj"
Apr 24 23:54:50.075674 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.070056 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfxbj\" (UniqueName: \"kubernetes.io/projected/84fefcad-12b8-4f83-93a4-280b95fa779c-kube-api-access-kfxbj\") pod \"klusterlet-addon-workmgr-57dcbdb694-vtnjj\" (UID: \"84fefcad-12b8-4f83-93a4-280b95fa779c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-57dcbdb694-vtnjj"
Apr 24 23:54:50.129457 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.129264 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-7xqlq"]
Apr 24 23:54:50.148044 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.148025 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-7xqlq"
Apr 24 23:54:50.151967 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.151834 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-gprwq\""
Apr 24 23:54:50.151967 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.151872 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 24 23:54:50.152288 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.152244 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 24 23:54:50.152499 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.152481 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 24 23:54:50.152583 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.152510 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 24 23:54:50.152979 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.152960 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-7xqlq"]
Apr 24 23:54:50.171209 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.171185 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/84fefcad-12b8-4f83-93a4-280b95fa779c-tmp\") pod \"klusterlet-addon-workmgr-57dcbdb694-vtnjj\" (UID: \"84fefcad-12b8-4f83-93a4-280b95fa779c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-57dcbdb694-vtnjj"
Apr 24 23:54:50.171318 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.171240 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/37d2d963-9753-4296-ab1e-ae0df5ec3511-oauth-serving-cert\") pod \"console-5695878b76-x468z\" (UID: \"37d2d963-9753-4296-ab1e-ae0df5ec3511\") " pod="openshift-console/console-5695878b76-x468z"
Apr 24 23:54:50.171318 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.171286 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tscvs\" (UniqueName: \"kubernetes.io/projected/e969b2dc-2e72-4829-b6e2-ba78de037eaf-kube-api-access-tscvs\") pod \"managed-serviceaccount-addon-agent-58cbcbff7-ft9x9\" (UID: \"e969b2dc-2e72-4829-b6e2-ba78de037eaf\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-58cbcbff7-ft9x9"
Apr 24 23:54:50.171448 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.171315 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/e969b2dc-2e72-4829-b6e2-ba78de037eaf-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-58cbcbff7-ft9x9\" (UID: \"e969b2dc-2e72-4829-b6e2-ba78de037eaf\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-58cbcbff7-ft9x9"
Apr 24 23:54:50.171448 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.171348 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/84fefcad-12b8-4f83-93a4-280b95fa779c-klusterlet-config\") pod \"klusterlet-addon-workmgr-57dcbdb694-vtnjj\" (UID: \"84fefcad-12b8-4f83-93a4-280b95fa779c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-57dcbdb694-vtnjj"
Apr 24 23:54:50.171448 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.171415 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kfxbj\" (UniqueName: \"kubernetes.io/projected/84fefcad-12b8-4f83-93a4-280b95fa779c-kube-api-access-kfxbj\") pod \"klusterlet-addon-workmgr-57dcbdb694-vtnjj\" (UID: \"84fefcad-12b8-4f83-93a4-280b95fa779c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-57dcbdb694-vtnjj"
Apr 24 23:54:50.171448 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.171442 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/37d2d963-9753-4296-ab1e-ae0df5ec3511-console-serving-cert\") pod \"console-5695878b76-x468z\" (UID: \"37d2d963-9753-4296-ab1e-ae0df5ec3511\") " pod="openshift-console/console-5695878b76-x468z"
Apr 24 23:54:50.172249 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.171472 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/37d2d963-9753-4296-ab1e-ae0df5ec3511-console-oauth-config\") pod \"console-5695878b76-x468z\" (UID: \"37d2d963-9753-4296-ab1e-ae0df5ec3511\") " pod="openshift-console/console-5695878b76-x468z"
Apr 24 23:54:50.172249 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.171496 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/37d2d963-9753-4296-ab1e-ae0df5ec3511-console-config\") pod \"console-5695878b76-x468z\" (UID: \"37d2d963-9753-4296-ab1e-ae0df5ec3511\") " pod="openshift-console/console-5695878b76-x468z"
Apr 24 23:54:50.172249 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.171639 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/84fefcad-12b8-4f83-93a4-280b95fa779c-tmp\") pod \"klusterlet-addon-workmgr-57dcbdb694-vtnjj\" (UID: \"84fefcad-12b8-4f83-93a4-280b95fa779c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-57dcbdb694-vtnjj"
Apr 24 23:54:50.172249 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.171677 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/37d2d963-9753-4296-ab1e-ae0df5ec3511-service-ca\") pod \"console-5695878b76-x468z\" (UID: \"37d2d963-9753-4296-ab1e-ae0df5ec3511\") " pod="openshift-console/console-5695878b76-x468z"
Apr 24 23:54:50.172249 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.171706 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghqgb\" (UniqueName: \"kubernetes.io/projected/37d2d963-9753-4296-ab1e-ae0df5ec3511-kube-api-access-ghqgb\") pod \"console-5695878b76-x468z\" (UID: \"37d2d963-9753-4296-ab1e-ae0df5ec3511\") " pod="openshift-console/console-5695878b76-x468z"
Apr 24 23:54:50.172249 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.171735 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpghw\" (UniqueName: \"kubernetes.io/projected/582002e5-6e19-4c7c-afa8-2c680db672f4-kube-api-access-vpghw\") pod \"downloads-6bcc868b7-ddrd5\" (UID: \"582002e5-6e19-4c7c-afa8-2c680db672f4\") " pod="openshift-console/downloads-6bcc868b7-ddrd5"
Apr 24 23:54:50.173990 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.173947 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/e969b2dc-2e72-4829-b6e2-ba78de037eaf-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-58cbcbff7-ft9x9\" (UID: \"e969b2dc-2e72-4829-b6e2-ba78de037eaf\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-58cbcbff7-ft9x9"
Apr 24 23:54:50.174804 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.174762 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/84fefcad-12b8-4f83-93a4-280b95fa779c-klusterlet-config\") pod \"klusterlet-addon-workmgr-57dcbdb694-vtnjj\" (UID: \"84fefcad-12b8-4f83-93a4-280b95fa779c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-57dcbdb694-vtnjj"
Apr 24 23:54:50.190617 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.190594 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfxbj\" (UniqueName: \"kubernetes.io/projected/84fefcad-12b8-4f83-93a4-280b95fa779c-kube-api-access-kfxbj\") pod \"klusterlet-addon-workmgr-57dcbdb694-vtnjj\" (UID: \"84fefcad-12b8-4f83-93a4-280b95fa779c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-57dcbdb694-vtnjj"
Apr 24 23:54:50.191847 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.191809 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tscvs\" (UniqueName: \"kubernetes.io/projected/e969b2dc-2e72-4829-b6e2-ba78de037eaf-kube-api-access-tscvs\") pod \"managed-serviceaccount-addon-agent-58cbcbff7-ft9x9\" (UID: \"e969b2dc-2e72-4829-b6e2-ba78de037eaf\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-58cbcbff7-ft9x9"
Apr 24 23:54:50.198711 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.198572 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-76d68c7b68-hgbqp"]
Apr 24 23:54:50.207290 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.207267 2569 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-image-registry/image-registry-76d68c7b68-hgbqp" Apr 24 23:54:50.219849 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.219506 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-76d68c7b68-hgbqp"] Apr 24 23:54:50.224002 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.223978 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-npwg7" event={"ID":"37e765cb-b1c9-4330-ac47-4918ba2ebf0a","Type":"ContainerStarted","Data":"47ef4f962b4b87b75887d0d59e4093fd30872154dfe3eb76bbad108de562ffc8"} Apr 24 23:54:50.227164 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.227140 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-ckvqn" event={"ID":"586d6de3-4325-4b34-af6a-576dc929fdce","Type":"ContainerStarted","Data":"403ff5fe5ecc137564518a3542d3825f00541b79a451741b5590cdf94d3185d7"} Apr 24 23:54:50.227278 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.227170 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-ckvqn" event={"ID":"586d6de3-4325-4b34-af6a-576dc929fdce","Type":"ContainerStarted","Data":"e36be94fe459a7881bacb6de724c2e3048ff7e7098475899457c24a80fe20578"} Apr 24 23:54:50.227619 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.227603 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-ckvqn" Apr 24 23:54:50.228662 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.228629 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-xvjhv" event={"ID":"b819f9b3-e5fe-4501-9625-b73431d3105c","Type":"ContainerStarted","Data":"5d8dfbae1c87582759042db1245b6e7e59712cd46d6e35f838290441babf6ab4"} Apr 24 23:54:50.250225 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.250200 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-58cbcbff7-ft9x9" Apr 24 23:54:50.260935 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.260913 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-57dcbdb694-vtnjj" Apr 24 23:54:50.265198 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.264531 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-xvjhv" podStartSLOduration=33.51301884 podStartE2EDuration="36.26451599s" podCreationTimestamp="2026-04-24 23:54:14 +0000 UTC" firstStartedPulling="2026-04-24 23:54:47.037923714 +0000 UTC m=+65.642650290" lastFinishedPulling="2026-04-24 23:54:49.78942086 +0000 UTC m=+68.394147440" observedRunningTime="2026-04-24 23:54:50.263975886 +0000 UTC m=+68.868702494" watchObservedRunningTime="2026-04-24 23:54:50.26451599 +0000 UTC m=+68.869242591" Apr 24 23:54:50.273193 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.272273 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/37d2d963-9753-4296-ab1e-ae0df5ec3511-console-serving-cert\") pod \"console-5695878b76-x468z\" (UID: \"37d2d963-9753-4296-ab1e-ae0df5ec3511\") " pod="openshift-console/console-5695878b76-x468z" Apr 24 23:54:50.273193 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.272317 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/37d2d963-9753-4296-ab1e-ae0df5ec3511-console-oauth-config\") pod \"console-5695878b76-x468z\" (UID: \"37d2d963-9753-4296-ab1e-ae0df5ec3511\") " pod="openshift-console/console-5695878b76-x468z" Apr 24 23:54:50.273193 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.272347 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"console-config\" (UniqueName: \"kubernetes.io/configmap/37d2d963-9753-4296-ab1e-ae0df5ec3511-console-config\") pod \"console-5695878b76-x468z\" (UID: \"37d2d963-9753-4296-ab1e-ae0df5ec3511\") " pod="openshift-console/console-5695878b76-x468z" Apr 24 23:54:50.273193 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.272376 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/37d2d963-9753-4296-ab1e-ae0df5ec3511-service-ca\") pod \"console-5695878b76-x468z\" (UID: \"37d2d963-9753-4296-ab1e-ae0df5ec3511\") " pod="openshift-console/console-5695878b76-x468z" Apr 24 23:54:50.273193 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.272417 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ghqgb\" (UniqueName: \"kubernetes.io/projected/37d2d963-9753-4296-ab1e-ae0df5ec3511-kube-api-access-ghqgb\") pod \"console-5695878b76-x468z\" (UID: \"37d2d963-9753-4296-ab1e-ae0df5ec3511\") " pod="openshift-console/console-5695878b76-x468z" Apr 24 23:54:50.273193 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.272448 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vpghw\" (UniqueName: \"kubernetes.io/projected/582002e5-6e19-4c7c-afa8-2c680db672f4-kube-api-access-vpghw\") pod \"downloads-6bcc868b7-ddrd5\" (UID: \"582002e5-6e19-4c7c-afa8-2c680db672f4\") " pod="openshift-console/downloads-6bcc868b7-ddrd5" Apr 24 23:54:50.273193 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.272506 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/4d74dd05-34d6-465d-b59b-4694e782b05f-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-7xqlq\" (UID: \"4d74dd05-34d6-465d-b59b-4694e782b05f\") " pod="openshift-insights/insights-runtime-extractor-7xqlq" Apr 24 23:54:50.273193 
ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.272539 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/4d74dd05-34d6-465d-b59b-4694e782b05f-data-volume\") pod \"insights-runtime-extractor-7xqlq\" (UID: \"4d74dd05-34d6-465d-b59b-4694e782b05f\") " pod="openshift-insights/insights-runtime-extractor-7xqlq" Apr 24 23:54:50.273193 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.272610 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/4d74dd05-34d6-465d-b59b-4694e782b05f-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-7xqlq\" (UID: \"4d74dd05-34d6-465d-b59b-4694e782b05f\") " pod="openshift-insights/insights-runtime-extractor-7xqlq" Apr 24 23:54:50.273193 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.272654 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/4d74dd05-34d6-465d-b59b-4694e782b05f-crio-socket\") pod \"insights-runtime-extractor-7xqlq\" (UID: \"4d74dd05-34d6-465d-b59b-4694e782b05f\") " pod="openshift-insights/insights-runtime-extractor-7xqlq" Apr 24 23:54:50.275718 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.272704 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bb9cr\" (UniqueName: \"kubernetes.io/projected/4d74dd05-34d6-465d-b59b-4694e782b05f-kube-api-access-bb9cr\") pod \"insights-runtime-extractor-7xqlq\" (UID: \"4d74dd05-34d6-465d-b59b-4694e782b05f\") " pod="openshift-insights/insights-runtime-extractor-7xqlq" Apr 24 23:54:50.275718 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.273277 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/37d2d963-9753-4296-ab1e-ae0df5ec3511-oauth-serving-cert\") pod \"console-5695878b76-x468z\" (UID: \"37d2d963-9753-4296-ab1e-ae0df5ec3511\") " pod="openshift-console/console-5695878b76-x468z" Apr 24 23:54:50.275718 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.274079 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/37d2d963-9753-4296-ab1e-ae0df5ec3511-oauth-serving-cert\") pod \"console-5695878b76-x468z\" (UID: \"37d2d963-9753-4296-ab1e-ae0df5ec3511\") " pod="openshift-console/console-5695878b76-x468z" Apr 24 23:54:50.275718 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.275184 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/37d2d963-9753-4296-ab1e-ae0df5ec3511-service-ca\") pod \"console-5695878b76-x468z\" (UID: \"37d2d963-9753-4296-ab1e-ae0df5ec3511\") " pod="openshift-console/console-5695878b76-x468z" Apr 24 23:54:50.276572 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.276523 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/37d2d963-9753-4296-ab1e-ae0df5ec3511-console-config\") pod \"console-5695878b76-x468z\" (UID: \"37d2d963-9753-4296-ab1e-ae0df5ec3511\") " pod="openshift-console/console-5695878b76-x468z" Apr 24 23:54:50.277728 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.277682 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/37d2d963-9753-4296-ab1e-ae0df5ec3511-console-serving-cert\") pod \"console-5695878b76-x468z\" (UID: \"37d2d963-9753-4296-ab1e-ae0df5ec3511\") " pod="openshift-console/console-5695878b76-x468z" Apr 24 23:54:50.279497 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.279444 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/37d2d963-9753-4296-ab1e-ae0df5ec3511-console-oauth-config\") pod \"console-5695878b76-x468z\" (UID: \"37d2d963-9753-4296-ab1e-ae0df5ec3511\") " pod="openshift-console/console-5695878b76-x468z" Apr 24 23:54:50.283442 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.283256 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-ckvqn" podStartSLOduration=33.519124414 podStartE2EDuration="36.283244261s" podCreationTimestamp="2026-04-24 23:54:14 +0000 UTC" firstStartedPulling="2026-04-24 23:54:47.025304181 +0000 UTC m=+65.630030757" lastFinishedPulling="2026-04-24 23:54:49.789424028 +0000 UTC m=+68.394150604" observedRunningTime="2026-04-24 23:54:50.283109252 +0000 UTC m=+68.887835885" watchObservedRunningTime="2026-04-24 23:54:50.283244261 +0000 UTC m=+68.887970861" Apr 24 23:54:50.290898 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.290842 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghqgb\" (UniqueName: \"kubernetes.io/projected/37d2d963-9753-4296-ab1e-ae0df5ec3511-kube-api-access-ghqgb\") pod \"console-5695878b76-x468z\" (UID: \"37d2d963-9753-4296-ab1e-ae0df5ec3511\") " pod="openshift-console/console-5695878b76-x468z" Apr 24 23:54:50.291658 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.291614 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpghw\" (UniqueName: \"kubernetes.io/projected/582002e5-6e19-4c7c-afa8-2c680db672f4-kube-api-access-vpghw\") pod \"downloads-6bcc868b7-ddrd5\" (UID: \"582002e5-6e19-4c7c-afa8-2c680db672f4\") " pod="openshift-console/downloads-6bcc868b7-ddrd5" Apr 24 23:54:50.357043 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.356286 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5695878b76-x468z" Apr 24 23:54:50.375350 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.375309 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/25b6cdb5-aed3-4024-93b1-63dd4d5a7299-registry-certificates\") pod \"image-registry-76d68c7b68-hgbqp\" (UID: \"25b6cdb5-aed3-4024-93b1-63dd4d5a7299\") " pod="openshift-image-registry/image-registry-76d68c7b68-hgbqp" Apr 24 23:54:50.375500 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.375365 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/25b6cdb5-aed3-4024-93b1-63dd4d5a7299-trusted-ca\") pod \"image-registry-76d68c7b68-hgbqp\" (UID: \"25b6cdb5-aed3-4024-93b1-63dd4d5a7299\") " pod="openshift-image-registry/image-registry-76d68c7b68-hgbqp" Apr 24 23:54:50.375500 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.375446 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/25b6cdb5-aed3-4024-93b1-63dd4d5a7299-installation-pull-secrets\") pod \"image-registry-76d68c7b68-hgbqp\" (UID: \"25b6cdb5-aed3-4024-93b1-63dd4d5a7299\") " pod="openshift-image-registry/image-registry-76d68c7b68-hgbqp" Apr 24 23:54:50.375623 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.375502 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwnh7\" (UniqueName: \"kubernetes.io/projected/25b6cdb5-aed3-4024-93b1-63dd4d5a7299-kube-api-access-wwnh7\") pod \"image-registry-76d68c7b68-hgbqp\" (UID: \"25b6cdb5-aed3-4024-93b1-63dd4d5a7299\") " pod="openshift-image-registry/image-registry-76d68c7b68-hgbqp" Apr 24 23:54:50.375623 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.375537 2569 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/4d74dd05-34d6-465d-b59b-4694e782b05f-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-7xqlq\" (UID: \"4d74dd05-34d6-465d-b59b-4694e782b05f\") " pod="openshift-insights/insights-runtime-extractor-7xqlq" Apr 24 23:54:50.375623 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.375576 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/4d74dd05-34d6-465d-b59b-4694e782b05f-data-volume\") pod \"insights-runtime-extractor-7xqlq\" (UID: \"4d74dd05-34d6-465d-b59b-4694e782b05f\") " pod="openshift-insights/insights-runtime-extractor-7xqlq" Apr 24 23:54:50.375767 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.375628 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/25b6cdb5-aed3-4024-93b1-63dd4d5a7299-registry-tls\") pod \"image-registry-76d68c7b68-hgbqp\" (UID: \"25b6cdb5-aed3-4024-93b1-63dd4d5a7299\") " pod="openshift-image-registry/image-registry-76d68c7b68-hgbqp" Apr 24 23:54:50.375767 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.375657 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/4d74dd05-34d6-465d-b59b-4694e782b05f-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-7xqlq\" (UID: \"4d74dd05-34d6-465d-b59b-4694e782b05f\") " pod="openshift-insights/insights-runtime-extractor-7xqlq" Apr 24 23:54:50.375767 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.375684 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/25b6cdb5-aed3-4024-93b1-63dd4d5a7299-bound-sa-token\") pod \"image-registry-76d68c7b68-hgbqp\" 
(UID: \"25b6cdb5-aed3-4024-93b1-63dd4d5a7299\") " pod="openshift-image-registry/image-registry-76d68c7b68-hgbqp" Apr 24 23:54:50.375767 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.375713 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/4d74dd05-34d6-465d-b59b-4694e782b05f-crio-socket\") pod \"insights-runtime-extractor-7xqlq\" (UID: \"4d74dd05-34d6-465d-b59b-4694e782b05f\") " pod="openshift-insights/insights-runtime-extractor-7xqlq" Apr 24 23:54:50.375767 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.375764 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bb9cr\" (UniqueName: \"kubernetes.io/projected/4d74dd05-34d6-465d-b59b-4694e782b05f-kube-api-access-bb9cr\") pod \"insights-runtime-extractor-7xqlq\" (UID: \"4d74dd05-34d6-465d-b59b-4694e782b05f\") " pod="openshift-insights/insights-runtime-extractor-7xqlq" Apr 24 23:54:50.376010 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.375796 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/25b6cdb5-aed3-4024-93b1-63dd4d5a7299-ca-trust-extracted\") pod \"image-registry-76d68c7b68-hgbqp\" (UID: \"25b6cdb5-aed3-4024-93b1-63dd4d5a7299\") " pod="openshift-image-registry/image-registry-76d68c7b68-hgbqp" Apr 24 23:54:50.376010 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.375833 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/25b6cdb5-aed3-4024-93b1-63dd4d5a7299-image-registry-private-configuration\") pod \"image-registry-76d68c7b68-hgbqp\" (UID: \"25b6cdb5-aed3-4024-93b1-63dd4d5a7299\") " pod="openshift-image-registry/image-registry-76d68c7b68-hgbqp" Apr 24 23:54:50.376342 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.376318 
2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/4d74dd05-34d6-465d-b59b-4694e782b05f-data-volume\") pod \"insights-runtime-extractor-7xqlq\" (UID: \"4d74dd05-34d6-465d-b59b-4694e782b05f\") " pod="openshift-insights/insights-runtime-extractor-7xqlq" Apr 24 23:54:50.377065 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.377037 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/4d74dd05-34d6-465d-b59b-4694e782b05f-crio-socket\") pod \"insights-runtime-extractor-7xqlq\" (UID: \"4d74dd05-34d6-465d-b59b-4694e782b05f\") " pod="openshift-insights/insights-runtime-extractor-7xqlq" Apr 24 23:54:50.377231 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.377206 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/4d74dd05-34d6-465d-b59b-4694e782b05f-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-7xqlq\" (UID: \"4d74dd05-34d6-465d-b59b-4694e782b05f\") " pod="openshift-insights/insights-runtime-extractor-7xqlq" Apr 24 23:54:50.379375 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.379354 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-ddrd5" Apr 24 23:54:50.379824 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.379799 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/4d74dd05-34d6-465d-b59b-4694e782b05f-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-7xqlq\" (UID: \"4d74dd05-34d6-465d-b59b-4694e782b05f\") " pod="openshift-insights/insights-runtime-extractor-7xqlq" Apr 24 23:54:50.386321 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.386263 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bb9cr\" (UniqueName: \"kubernetes.io/projected/4d74dd05-34d6-465d-b59b-4694e782b05f-kube-api-access-bb9cr\") pod \"insights-runtime-extractor-7xqlq\" (UID: \"4d74dd05-34d6-465d-b59b-4694e782b05f\") " pod="openshift-insights/insights-runtime-extractor-7xqlq" Apr 24 23:54:50.412579 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.412522 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-57dcbdb694-vtnjj"] Apr 24 23:54:50.417172 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:54:50.416854 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84fefcad_12b8_4f83_93a4_280b95fa779c.slice/crio-f97efd7da47e705366cbf0636d30320e85491d67fd5cee1094627b08fa867cc8 WatchSource:0}: Error finding container f97efd7da47e705366cbf0636d30320e85491d67fd5cee1094627b08fa867cc8: Status 404 returned error can't find the container with id f97efd7da47e705366cbf0636d30320e85491d67fd5cee1094627b08fa867cc8 Apr 24 23:54:50.432381 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.432243 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-58cbcbff7-ft9x9"] Apr 24 23:54:50.466326 
ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.465877 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-7xqlq" Apr 24 23:54:50.478327 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.477097 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/25b6cdb5-aed3-4024-93b1-63dd4d5a7299-ca-trust-extracted\") pod \"image-registry-76d68c7b68-hgbqp\" (UID: \"25b6cdb5-aed3-4024-93b1-63dd4d5a7299\") " pod="openshift-image-registry/image-registry-76d68c7b68-hgbqp" Apr 24 23:54:50.478327 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.477150 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/25b6cdb5-aed3-4024-93b1-63dd4d5a7299-image-registry-private-configuration\") pod \"image-registry-76d68c7b68-hgbqp\" (UID: \"25b6cdb5-aed3-4024-93b1-63dd4d5a7299\") " pod="openshift-image-registry/image-registry-76d68c7b68-hgbqp" Apr 24 23:54:50.478327 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.477198 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/25b6cdb5-aed3-4024-93b1-63dd4d5a7299-registry-certificates\") pod \"image-registry-76d68c7b68-hgbqp\" (UID: \"25b6cdb5-aed3-4024-93b1-63dd4d5a7299\") " pod="openshift-image-registry/image-registry-76d68c7b68-hgbqp" Apr 24 23:54:50.478327 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.477225 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/25b6cdb5-aed3-4024-93b1-63dd4d5a7299-trusted-ca\") pod \"image-registry-76d68c7b68-hgbqp\" (UID: \"25b6cdb5-aed3-4024-93b1-63dd4d5a7299\") " pod="openshift-image-registry/image-registry-76d68c7b68-hgbqp" Apr 24 23:54:50.478327 
ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.477271 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/25b6cdb5-aed3-4024-93b1-63dd4d5a7299-installation-pull-secrets\") pod \"image-registry-76d68c7b68-hgbqp\" (UID: \"25b6cdb5-aed3-4024-93b1-63dd4d5a7299\") " pod="openshift-image-registry/image-registry-76d68c7b68-hgbqp" Apr 24 23:54:50.478327 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.477313 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wwnh7\" (UniqueName: \"kubernetes.io/projected/25b6cdb5-aed3-4024-93b1-63dd4d5a7299-kube-api-access-wwnh7\") pod \"image-registry-76d68c7b68-hgbqp\" (UID: \"25b6cdb5-aed3-4024-93b1-63dd4d5a7299\") " pod="openshift-image-registry/image-registry-76d68c7b68-hgbqp" Apr 24 23:54:50.478327 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.477355 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/25b6cdb5-aed3-4024-93b1-63dd4d5a7299-registry-tls\") pod \"image-registry-76d68c7b68-hgbqp\" (UID: \"25b6cdb5-aed3-4024-93b1-63dd4d5a7299\") " pod="openshift-image-registry/image-registry-76d68c7b68-hgbqp" Apr 24 23:54:50.478327 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.477383 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/25b6cdb5-aed3-4024-93b1-63dd4d5a7299-bound-sa-token\") pod \"image-registry-76d68c7b68-hgbqp\" (UID: \"25b6cdb5-aed3-4024-93b1-63dd4d5a7299\") " pod="openshift-image-registry/image-registry-76d68c7b68-hgbqp" Apr 24 23:54:50.478327 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.478217 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/25b6cdb5-aed3-4024-93b1-63dd4d5a7299-registry-certificates\") pod \"image-registry-76d68c7b68-hgbqp\" (UID: \"25b6cdb5-aed3-4024-93b1-63dd4d5a7299\") " pod="openshift-image-registry/image-registry-76d68c7b68-hgbqp"
Apr 24 23:54:50.478864 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.478706 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/25b6cdb5-aed3-4024-93b1-63dd4d5a7299-ca-trust-extracted\") pod \"image-registry-76d68c7b68-hgbqp\" (UID: \"25b6cdb5-aed3-4024-93b1-63dd4d5a7299\") " pod="openshift-image-registry/image-registry-76d68c7b68-hgbqp"
Apr 24 23:54:50.480482 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.479359 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/25b6cdb5-aed3-4024-93b1-63dd4d5a7299-trusted-ca\") pod \"image-registry-76d68c7b68-hgbqp\" (UID: \"25b6cdb5-aed3-4024-93b1-63dd4d5a7299\") " pod="openshift-image-registry/image-registry-76d68c7b68-hgbqp"
Apr 24 23:54:50.482089 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.482061 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/25b6cdb5-aed3-4024-93b1-63dd4d5a7299-installation-pull-secrets\") pod \"image-registry-76d68c7b68-hgbqp\" (UID: \"25b6cdb5-aed3-4024-93b1-63dd4d5a7299\") " pod="openshift-image-registry/image-registry-76d68c7b68-hgbqp"
Apr 24 23:54:50.482181 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.482121 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/25b6cdb5-aed3-4024-93b1-63dd4d5a7299-registry-tls\") pod \"image-registry-76d68c7b68-hgbqp\" (UID: \"25b6cdb5-aed3-4024-93b1-63dd4d5a7299\") " pod="openshift-image-registry/image-registry-76d68c7b68-hgbqp"
Apr 24 23:54:50.483266 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.483223 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/25b6cdb5-aed3-4024-93b1-63dd4d5a7299-image-registry-private-configuration\") pod \"image-registry-76d68c7b68-hgbqp\" (UID: \"25b6cdb5-aed3-4024-93b1-63dd4d5a7299\") " pod="openshift-image-registry/image-registry-76d68c7b68-hgbqp"
Apr 24 23:54:50.498980 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.498919 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwnh7\" (UniqueName: \"kubernetes.io/projected/25b6cdb5-aed3-4024-93b1-63dd4d5a7299-kube-api-access-wwnh7\") pod \"image-registry-76d68c7b68-hgbqp\" (UID: \"25b6cdb5-aed3-4024-93b1-63dd4d5a7299\") " pod="openshift-image-registry/image-registry-76d68c7b68-hgbqp"
Apr 24 23:54:50.500060 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.500020 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/25b6cdb5-aed3-4024-93b1-63dd4d5a7299-bound-sa-token\") pod \"image-registry-76d68c7b68-hgbqp\" (UID: \"25b6cdb5-aed3-4024-93b1-63dd4d5a7299\") " pod="openshift-image-registry/image-registry-76d68c7b68-hgbqp"
Apr 24 23:54:50.512580 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.512551 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5695878b76-x468z"]
Apr 24 23:54:50.515894 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:54:50.515864 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37d2d963_9753_4296_ab1e_ae0df5ec3511.slice/crio-09cef4c7b73d6170ff3ec9b9d2a90a0ae3bcc80acae45ec0d7d24b06e36b7484 WatchSource:0}: Error finding container 09cef4c7b73d6170ff3ec9b9d2a90a0ae3bcc80acae45ec0d7d24b06e36b7484: Status 404 returned error can't find the container with id 09cef4c7b73d6170ff3ec9b9d2a90a0ae3bcc80acae45ec0d7d24b06e36b7484
Apr 24 23:54:50.516948 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.516858 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-76d68c7b68-hgbqp"
Apr 24 23:54:50.548658 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.548631 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-ddrd5"]
Apr 24 23:54:50.550966 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:54:50.550932 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod582002e5_6e19_4c7c_afa8_2c680db672f4.slice/crio-7a6b3d6a01f4466d2ba29a15e88de2b89972dd1ad21b15b74c3e64d2edd8c706 WatchSource:0}: Error finding container 7a6b3d6a01f4466d2ba29a15e88de2b89972dd1ad21b15b74c3e64d2edd8c706: Status 404 returned error can't find the container with id 7a6b3d6a01f4466d2ba29a15e88de2b89972dd1ad21b15b74c3e64d2edd8c706
Apr 24 23:54:50.605101 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.605058 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-7xqlq"]
Apr 24 23:54:50.656907 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.656872 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-76d68c7b68-hgbqp"]
Apr 24 23:54:50.660116 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:54:50.660087 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25b6cdb5_aed3_4024_93b1_63dd4d5a7299.slice/crio-22da08336b0b0e8ab302ea37c12cf75d351887a7ff0dc64d2b75836bee1dce37 WatchSource:0}: Error finding container 22da08336b0b0e8ab302ea37c12cf75d351887a7ff0dc64d2b75836bee1dce37: Status 404 returned error can't find the container with id 22da08336b0b0e8ab302ea37c12cf75d351887a7ff0dc64d2b75836bee1dce37
Apr 24 23:54:50.967315 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.966128 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-x8pb6"]
Apr 24 23:54:50.983826 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.983575 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-x8pb6"]
Apr 24 23:54:50.983826 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.983730 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-x8pb6"
Apr 24 23:54:50.986466 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.986257 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\""
Apr 24 23:54:50.991437 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:50.991349 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-9nqlt\""
Apr 24 23:54:51.082724 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:51.082618 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/499f22b6-5f9c-4b8d-9958-51fbade0900a-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-x8pb6\" (UID: \"499f22b6-5f9c-4b8d-9958-51fbade0900a\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-x8pb6"
Apr 24 23:54:51.184819 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:51.184151 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/499f22b6-5f9c-4b8d-9958-51fbade0900a-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-x8pb6\" (UID: \"499f22b6-5f9c-4b8d-9958-51fbade0900a\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-x8pb6"
Apr 24 23:54:51.184819 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:54:51.184427 2569 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" not found
Apr 24 23:54:51.184819 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:54:51.184502 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/499f22b6-5f9c-4b8d-9958-51fbade0900a-tls-certificates podName:499f22b6-5f9c-4b8d-9958-51fbade0900a nodeName:}" failed. No retries permitted until 2026-04-24 23:54:51.684477097 +0000 UTC m=+70.289203676 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/499f22b6-5f9c-4b8d-9958-51fbade0900a-tls-certificates") pod "prometheus-operator-admission-webhook-57cf98b594-x8pb6" (UID: "499f22b6-5f9c-4b8d-9958-51fbade0900a") : secret "prometheus-operator-admission-webhook-tls" not found
Apr 24 23:54:51.240430 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:51.239951 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-npwg7" event={"ID":"37e765cb-b1c9-4330-ac47-4918ba2ebf0a","Type":"ContainerStarted","Data":"024ee41d8141d20bbf57078ab99e3876968fbae50413df7d99d95184de9bb7e9"}
Apr 24 23:54:51.244430 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:51.243710 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-76d68c7b68-hgbqp" event={"ID":"25b6cdb5-aed3-4024-93b1-63dd4d5a7299","Type":"ContainerStarted","Data":"dec1e3281078a768275655463b3eb317cd97d2afceb450dfd7d08e2bc5e8f190"}
Apr 24 23:54:51.244430 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:51.243747 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-76d68c7b68-hgbqp" event={"ID":"25b6cdb5-aed3-4024-93b1-63dd4d5a7299","Type":"ContainerStarted","Data":"22da08336b0b0e8ab302ea37c12cf75d351887a7ff0dc64d2b75836bee1dce37"}
Apr 24 23:54:51.244430 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:51.244266 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-76d68c7b68-hgbqp"
Apr 24 23:54:51.247054 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:51.246982 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-ddrd5" event={"ID":"582002e5-6e19-4c7c-afa8-2c680db672f4","Type":"ContainerStarted","Data":"7a6b3d6a01f4466d2ba29a15e88de2b89972dd1ad21b15b74c3e64d2edd8c706"}
Apr 24 23:54:51.250332 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:51.250307 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5695878b76-x468z" event={"ID":"37d2d963-9753-4296-ab1e-ae0df5ec3511","Type":"ContainerStarted","Data":"09cef4c7b73d6170ff3ec9b9d2a90a0ae3bcc80acae45ec0d7d24b06e36b7484"}
Apr 24 23:54:51.253477 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:51.253443 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-58cbcbff7-ft9x9" event={"ID":"e969b2dc-2e72-4829-b6e2-ba78de037eaf","Type":"ContainerStarted","Data":"d4a34a120628da3e50cf2fb098f7b28a3f0028803076fdaa24dbf83b3fa4b8f6"}
Apr 24 23:54:51.258003 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:51.257931 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-7xqlq" event={"ID":"4d74dd05-34d6-465d-b59b-4694e782b05f","Type":"ContainerStarted","Data":"d9913f51b01106687b45d89c20208b320530654bfbdf1a785a4313c345815abb"}
Apr 24 23:54:51.258003 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:51.257961 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-7xqlq" event={"ID":"4d74dd05-34d6-465d-b59b-4694e782b05f","Type":"ContainerStarted","Data":"059ff0c8ce394290e0056f45f1d36bbca0ced81065545331dbddb9067ca60b5e"}
Apr 24 23:54:51.264113 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:51.262901 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-57dcbdb694-vtnjj" event={"ID":"84fefcad-12b8-4f83-93a4-280b95fa779c","Type":"ContainerStarted","Data":"f97efd7da47e705366cbf0636d30320e85491d67fd5cee1094627b08fa867cc8"}
Apr 24 23:54:51.265131 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:51.264938 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-npwg7" podStartSLOduration=67.094419609 podStartE2EDuration="1m9.26492315s" podCreationTimestamp="2026-04-24 23:53:42 +0000 UTC" firstStartedPulling="2026-04-24 23:54:47.86102875 +0000 UTC m=+66.465755332" lastFinishedPulling="2026-04-24 23:54:50.03153229 +0000 UTC m=+68.636258873" observedRunningTime="2026-04-24 23:54:51.258874231 +0000 UTC m=+69.863600828" watchObservedRunningTime="2026-04-24 23:54:51.26492315 +0000 UTC m=+69.869649749"
Apr 24 23:54:51.688609 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:51.688568 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/499f22b6-5f9c-4b8d-9958-51fbade0900a-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-x8pb6\" (UID: \"499f22b6-5f9c-4b8d-9958-51fbade0900a\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-x8pb6"
Apr 24 23:54:51.700513 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:51.700453 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/499f22b6-5f9c-4b8d-9958-51fbade0900a-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-x8pb6\" (UID: \"499f22b6-5f9c-4b8d-9958-51fbade0900a\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-x8pb6"
Apr 24 23:54:51.901423 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:51.900993 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-x8pb6"
Apr 24 23:54:51.998847 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:51.997312 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-76d68c7b68-hgbqp" podStartSLOduration=1.9972916440000001 podStartE2EDuration="1.997291644s" podCreationTimestamp="2026-04-24 23:54:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 23:54:51.291586323 +0000 UTC m=+69.896312920" watchObservedRunningTime="2026-04-24 23:54:51.997291644 +0000 UTC m=+70.602018243"
Apr 24 23:54:52.074991 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:52.074961 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-x8pb6"]
Apr 24 23:54:52.207226 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:54:52.207188 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod499f22b6_5f9c_4b8d_9958_51fbade0900a.slice/crio-c6ade435f9d909495629a04312c86979dd16198fab8105be226ecf9f1954061f WatchSource:0}: Error finding container c6ade435f9d909495629a04312c86979dd16198fab8105be226ecf9f1954061f: Status 404 returned error can't find the container with id c6ade435f9d909495629a04312c86979dd16198fab8105be226ecf9f1954061f
Apr 24 23:54:52.271093 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:52.271058 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-x8pb6" event={"ID":"499f22b6-5f9c-4b8d-9958-51fbade0900a","Type":"ContainerStarted","Data":"c6ade435f9d909495629a04312c86979dd16198fab8105be226ecf9f1954061f"}
Apr 24 23:54:53.157312 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:53.156512 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-lsbf2"
Apr 24 23:54:53.280512 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:53.280456 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-7xqlq" event={"ID":"4d74dd05-34d6-465d-b59b-4694e782b05f","Type":"ContainerStarted","Data":"7638a2b322b91e04f3649d3b8b68e4a93506294d8bb83d9e59e869a6adc3082b"}
Apr 24 23:54:58.298089 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:58.298035 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5695878b76-x468z" event={"ID":"37d2d963-9753-4296-ab1e-ae0df5ec3511","Type":"ContainerStarted","Data":"b20afbdd6199544e877dea2505c6299c6409208bc7f37d4040ee325e88d99bef"}
Apr 24 23:54:58.299520 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:58.299485 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-58cbcbff7-ft9x9" event={"ID":"e969b2dc-2e72-4829-b6e2-ba78de037eaf","Type":"ContainerStarted","Data":"f033fbed85c27e5845282fd9206eef825d7e03e8d944e3ad66b2ef974c4e9623"}
Apr 24 23:54:58.301475 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:58.301447 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-7xqlq" event={"ID":"4d74dd05-34d6-465d-b59b-4694e782b05f","Type":"ContainerStarted","Data":"1a80ea5838e363022e72dfe93de8bb29f2d79373b0e00e494feca6bfdc39f15b"}
Apr 24 23:54:58.302816 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:58.302793 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-57dcbdb694-vtnjj" event={"ID":"84fefcad-12b8-4f83-93a4-280b95fa779c","Type":"ContainerStarted","Data":"31e3a20234c680ca9e8815542dfa587c1abd39c7d0ea5c3dd3dcbb2c1d5696f1"}
Apr 24 23:54:58.302987 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:58.302970 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-57dcbdb694-vtnjj"
Apr 24 23:54:58.304205 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:58.304178 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-x8pb6" event={"ID":"499f22b6-5f9c-4b8d-9958-51fbade0900a","Type":"ContainerStarted","Data":"db8e61165d21f342f6b105231c986fdf32081d828608ea0d6b3583e546a0c1d4"}
Apr 24 23:54:58.304489 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:58.304468 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-x8pb6"
Apr 24 23:54:58.304815 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:58.304795 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-57dcbdb694-vtnjj"
Apr 24 23:54:58.309706 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:58.309688 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-x8pb6"
Apr 24 23:54:58.316571 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:58.316528 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5695878b76-x468z" podStartSLOduration=2.135761768 podStartE2EDuration="9.316513725s" podCreationTimestamp="2026-04-24 23:54:49 +0000 UTC" firstStartedPulling="2026-04-24 23:54:50.5179999 +0000 UTC m=+69.122726477" lastFinishedPulling="2026-04-24 23:54:57.698751858 +0000 UTC m=+76.303478434" observedRunningTime="2026-04-24 23:54:58.315550268 +0000 UTC m=+76.920276867" watchObservedRunningTime="2026-04-24 23:54:58.316513725 +0000 UTC m=+76.921240323"
Apr 24 23:54:58.331416 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:58.331365 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-x8pb6" podStartSLOduration=2.812261951 podStartE2EDuration="8.331354366s" podCreationTimestamp="2026-04-24 23:54:50 +0000 UTC" firstStartedPulling="2026-04-24 23:54:52.235152247 +0000 UTC m=+70.839878823" lastFinishedPulling="2026-04-24 23:54:57.754244659 +0000 UTC m=+76.358971238" observedRunningTime="2026-04-24 23:54:58.329694259 +0000 UTC m=+76.934420858" watchObservedRunningTime="2026-04-24 23:54:58.331354366 +0000 UTC m=+76.936080964"
Apr 24 23:54:58.349840 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:58.349791 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-57dcbdb694-vtnjj" podStartSLOduration=2.07037743 podStartE2EDuration="9.34977537s" podCreationTimestamp="2026-04-24 23:54:49 +0000 UTC" firstStartedPulling="2026-04-24 23:54:50.4192985 +0000 UTC m=+69.024025082" lastFinishedPulling="2026-04-24 23:54:57.698696436 +0000 UTC m=+76.303423022" observedRunningTime="2026-04-24 23:54:58.349050611 +0000 UTC m=+76.953777212" watchObservedRunningTime="2026-04-24 23:54:58.34977537 +0000 UTC m=+76.954501968"
Apr 24 23:54:58.367520 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:58.367472 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-7xqlq" podStartSLOduration=1.348589149 podStartE2EDuration="8.367459921s" podCreationTimestamp="2026-04-24 23:54:50 +0000 UTC" firstStartedPulling="2026-04-24 23:54:50.733776431 +0000 UTC m=+69.338503008" lastFinishedPulling="2026-04-24 23:54:57.752647192 +0000 UTC m=+76.357373780" observedRunningTime="2026-04-24 23:54:58.36688803 +0000 UTC m=+76.971614629" watchObservedRunningTime="2026-04-24 23:54:58.367459921 +0000 UTC m=+76.972186531"
Apr 24 23:54:58.382481 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:54:58.382441 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-58cbcbff7-ft9x9" podStartSLOduration=2.127838756 podStartE2EDuration="9.382431319s" podCreationTimestamp="2026-04-24 23:54:49 +0000 UTC" firstStartedPulling="2026-04-24 23:54:50.443894275 +0000 UTC m=+69.048620866" lastFinishedPulling="2026-04-24 23:54:57.698486842 +0000 UTC m=+76.303213429" observedRunningTime="2026-04-24 23:54:58.381071276 +0000 UTC m=+76.985797876" watchObservedRunningTime="2026-04-24 23:54:58.382431319 +0000 UTC m=+76.987157916"
Apr 24 23:55:00.357659 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:00.357627 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5695878b76-x468z"
Apr 24 23:55:00.358142 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:00.357669 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5695878b76-x468z"
Apr 24 23:55:00.359207 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:00.359176 2569 patch_prober.go:28] interesting pod/console-5695878b76-x468z container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.133.0.13:8443/health\": dial tcp 10.133.0.13:8443: connect: connection refused" start-of-body=
Apr 24 23:55:00.359344 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:00.359227 2569 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-console/console-5695878b76-x468z" podUID="37d2d963-9753-4296-ab1e-ae0df5ec3511" containerName="console" probeResult="failure" output="Get \"https://10.133.0.13:8443/health\": dial tcp 10.133.0.13:8443: connect: connection refused"
Apr 24 23:55:01.279797 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:01.279670 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-ckvqn"
Apr 24 23:55:03.433905 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:03.433705 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-bdlrw"]
Apr 24 23:55:03.438587 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:03.438562 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-bdlrw"
Apr 24 23:55:03.446075 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:03.445088 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 24 23:55:03.446075 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:03.445481 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 24 23:55:03.446075 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:03.445714 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 24 23:55:03.448141 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:03.447908 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 24 23:55:03.451568 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:03.450483 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 24 23:55:03.454579 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:03.452250 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-6rtvs\""
Apr 24 23:55:03.454579 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:03.452965 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 24 23:55:03.485152 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:03.485059 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/b67d7b0e-0fdf-4585-a96e-98063b80e4c3-node-exporter-textfile\") pod \"node-exporter-bdlrw\" (UID: \"b67d7b0e-0fdf-4585-a96e-98063b80e4c3\") " pod="openshift-monitoring/node-exporter-bdlrw"
Apr 24 23:55:03.485152 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:03.485103 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b67d7b0e-0fdf-4585-a96e-98063b80e4c3-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-bdlrw\" (UID: \"b67d7b0e-0fdf-4585-a96e-98063b80e4c3\") " pod="openshift-monitoring/node-exporter-bdlrw"
Apr 24 23:55:03.485359 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:03.485206 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b67d7b0e-0fdf-4585-a96e-98063b80e4c3-sys\") pod \"node-exporter-bdlrw\" (UID: \"b67d7b0e-0fdf-4585-a96e-98063b80e4c3\") " pod="openshift-monitoring/node-exporter-bdlrw"
Apr 24 23:55:03.485359 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:03.485262 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b67d7b0e-0fdf-4585-a96e-98063b80e4c3-node-exporter-tls\") pod \"node-exporter-bdlrw\" (UID: \"b67d7b0e-0fdf-4585-a96e-98063b80e4c3\") " pod="openshift-monitoring/node-exporter-bdlrw"
Apr 24 23:55:03.485359 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:03.485295 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/b67d7b0e-0fdf-4585-a96e-98063b80e4c3-node-exporter-accelerators-collector-config\") pod \"node-exporter-bdlrw\" (UID: \"b67d7b0e-0fdf-4585-a96e-98063b80e4c3\") " pod="openshift-monitoring/node-exporter-bdlrw"
Apr 24 23:55:03.485359 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:03.485341 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/b67d7b0e-0fdf-4585-a96e-98063b80e4c3-root\") pod \"node-exporter-bdlrw\" (UID: \"b67d7b0e-0fdf-4585-a96e-98063b80e4c3\") " pod="openshift-monitoring/node-exporter-bdlrw"
Apr 24 23:55:03.485596 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:03.485363 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/b67d7b0e-0fdf-4585-a96e-98063b80e4c3-node-exporter-wtmp\") pod \"node-exporter-bdlrw\" (UID: \"b67d7b0e-0fdf-4585-a96e-98063b80e4c3\") " pod="openshift-monitoring/node-exporter-bdlrw"
Apr 24 23:55:03.485596 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:03.485386 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxzpk\" (UniqueName: \"kubernetes.io/projected/b67d7b0e-0fdf-4585-a96e-98063b80e4c3-kube-api-access-zxzpk\") pod \"node-exporter-bdlrw\" (UID: \"b67d7b0e-0fdf-4585-a96e-98063b80e4c3\") " pod="openshift-monitoring/node-exporter-bdlrw"
Apr 24 23:55:03.485596 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:03.485433 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b67d7b0e-0fdf-4585-a96e-98063b80e4c3-metrics-client-ca\") pod \"node-exporter-bdlrw\" (UID: \"b67d7b0e-0fdf-4585-a96e-98063b80e4c3\") " pod="openshift-monitoring/node-exporter-bdlrw"
Apr 24 23:55:03.587421 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:03.586835 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b67d7b0e-0fdf-4585-a96e-98063b80e4c3-sys\") pod \"node-exporter-bdlrw\" (UID: \"b67d7b0e-0fdf-4585-a96e-98063b80e4c3\") " pod="openshift-monitoring/node-exporter-bdlrw"
Apr 24 23:55:03.587421 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:03.586890 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b67d7b0e-0fdf-4585-a96e-98063b80e4c3-node-exporter-tls\") pod \"node-exporter-bdlrw\" (UID: \"b67d7b0e-0fdf-4585-a96e-98063b80e4c3\") " pod="openshift-monitoring/node-exporter-bdlrw"
Apr 24 23:55:03.587421 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:03.586926 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/b67d7b0e-0fdf-4585-a96e-98063b80e4c3-node-exporter-accelerators-collector-config\") pod \"node-exporter-bdlrw\" (UID: \"b67d7b0e-0fdf-4585-a96e-98063b80e4c3\") " pod="openshift-monitoring/node-exporter-bdlrw"
Apr 24 23:55:03.587421 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:03.586963 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/b67d7b0e-0fdf-4585-a96e-98063b80e4c3-root\") pod \"node-exporter-bdlrw\" (UID: \"b67d7b0e-0fdf-4585-a96e-98063b80e4c3\") " pod="openshift-monitoring/node-exporter-bdlrw"
Apr 24 23:55:03.587421 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:03.586988 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/b67d7b0e-0fdf-4585-a96e-98063b80e4c3-node-exporter-wtmp\") pod \"node-exporter-bdlrw\" (UID: \"b67d7b0e-0fdf-4585-a96e-98063b80e4c3\") " pod="openshift-monitoring/node-exporter-bdlrw"
Apr 24 23:55:03.587421 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:03.586999 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b67d7b0e-0fdf-4585-a96e-98063b80e4c3-sys\") pod \"node-exporter-bdlrw\" (UID: \"b67d7b0e-0fdf-4585-a96e-98063b80e4c3\") " pod="openshift-monitoring/node-exporter-bdlrw"
Apr 24 23:55:03.587421 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:03.587011 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zxzpk\" (UniqueName: \"kubernetes.io/projected/b67d7b0e-0fdf-4585-a96e-98063b80e4c3-kube-api-access-zxzpk\") pod \"node-exporter-bdlrw\" (UID: \"b67d7b0e-0fdf-4585-a96e-98063b80e4c3\") " pod="openshift-monitoring/node-exporter-bdlrw"
Apr 24 23:55:03.587421 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:55:03.587071 2569 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 24 23:55:03.587421 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:55:03.587143 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b67d7b0e-0fdf-4585-a96e-98063b80e4c3-node-exporter-tls podName:b67d7b0e-0fdf-4585-a96e-98063b80e4c3 nodeName:}" failed. No retries permitted until 2026-04-24 23:55:04.087121363 +0000 UTC m=+82.691847943 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/b67d7b0e-0fdf-4585-a96e-98063b80e4c3-node-exporter-tls") pod "node-exporter-bdlrw" (UID: "b67d7b0e-0fdf-4585-a96e-98063b80e4c3") : secret "node-exporter-tls" not found
Apr 24 23:55:03.587421 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:03.587075 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b67d7b0e-0fdf-4585-a96e-98063b80e4c3-metrics-client-ca\") pod \"node-exporter-bdlrw\" (UID: \"b67d7b0e-0fdf-4585-a96e-98063b80e4c3\") " pod="openshift-monitoring/node-exporter-bdlrw"
Apr 24 23:55:03.587421 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:03.587324 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/b67d7b0e-0fdf-4585-a96e-98063b80e4c3-node-exporter-textfile\") pod \"node-exporter-bdlrw\" (UID: \"b67d7b0e-0fdf-4585-a96e-98063b80e4c3\") " pod="openshift-monitoring/node-exporter-bdlrw"
Apr 24 23:55:03.587421 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:03.587353 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b67d7b0e-0fdf-4585-a96e-98063b80e4c3-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-bdlrw\" (UID: \"b67d7b0e-0fdf-4585-a96e-98063b80e4c3\") " pod="openshift-monitoring/node-exporter-bdlrw"
Apr 24 23:55:03.588135 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:03.587772 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b67d7b0e-0fdf-4585-a96e-98063b80e4c3-metrics-client-ca\") pod \"node-exporter-bdlrw\" (UID: \"b67d7b0e-0fdf-4585-a96e-98063b80e4c3\") " pod="openshift-monitoring/node-exporter-bdlrw"
Apr 24 23:55:03.588135 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:03.587810 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/b67d7b0e-0fdf-4585-a96e-98063b80e4c3-node-exporter-accelerators-collector-config\") pod \"node-exporter-bdlrw\" (UID: \"b67d7b0e-0fdf-4585-a96e-98063b80e4c3\") " pod="openshift-monitoring/node-exporter-bdlrw"
Apr 24 23:55:03.588135 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:03.587833 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/b67d7b0e-0fdf-4585-a96e-98063b80e4c3-root\") pod \"node-exporter-bdlrw\" (UID: \"b67d7b0e-0fdf-4585-a96e-98063b80e4c3\") " pod="openshift-monitoring/node-exporter-bdlrw"
Apr 24 23:55:03.588135 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:03.587933 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/b67d7b0e-0fdf-4585-a96e-98063b80e4c3-node-exporter-wtmp\") pod \"node-exporter-bdlrw\" (UID: \"b67d7b0e-0fdf-4585-a96e-98063b80e4c3\") " pod="openshift-monitoring/node-exporter-bdlrw"
Apr 24 23:55:03.588135 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:03.588074 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/b67d7b0e-0fdf-4585-a96e-98063b80e4c3-node-exporter-textfile\") pod \"node-exporter-bdlrw\" (UID: \"b67d7b0e-0fdf-4585-a96e-98063b80e4c3\") " pod="openshift-monitoring/node-exporter-bdlrw"
Apr 24 23:55:03.598693 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:03.590464 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b67d7b0e-0fdf-4585-a96e-98063b80e4c3-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-bdlrw\" (UID: \"b67d7b0e-0fdf-4585-a96e-98063b80e4c3\") " pod="openshift-monitoring/node-exporter-bdlrw"
Apr 24 23:55:03.608968 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:03.601384 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxzpk\" (UniqueName: \"kubernetes.io/projected/b67d7b0e-0fdf-4585-a96e-98063b80e4c3-kube-api-access-zxzpk\") pod \"node-exporter-bdlrw\" (UID: \"b67d7b0e-0fdf-4585-a96e-98063b80e4c3\") " pod="openshift-monitoring/node-exporter-bdlrw"
Apr 24 23:55:04.093099 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:04.093067 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b67d7b0e-0fdf-4585-a96e-98063b80e4c3-node-exporter-tls\") pod \"node-exporter-bdlrw\" (UID: \"b67d7b0e-0fdf-4585-a96e-98063b80e4c3\") " pod="openshift-monitoring/node-exporter-bdlrw"
Apr 24 23:55:04.095645 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:04.095616 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b67d7b0e-0fdf-4585-a96e-98063b80e4c3-node-exporter-tls\") pod \"node-exporter-bdlrw\" (UID: \"b67d7b0e-0fdf-4585-a96e-98063b80e4c3\") " pod="openshift-monitoring/node-exporter-bdlrw"
Apr 24 23:55:04.358353 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:04.358274 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-bdlrw"
Apr 24 23:55:08.047161 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:55:08.047092 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb67d7b0e_0fdf_4585_a96e_98063b80e4c3.slice/crio-d1289fb21218828f83ad06decd165b9c75e3c2257b65fbdc4c02ef305e1abf68 WatchSource:0}: Error finding container d1289fb21218828f83ad06decd165b9c75e3c2257b65fbdc4c02ef305e1abf68: Status 404 returned error can't find the container with id d1289fb21218828f83ad06decd165b9c75e3c2257b65fbdc4c02ef305e1abf68
Apr 24 23:55:08.259567 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:08.259533 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5695878b76-x468z"]
Apr 24 23:55:08.334899 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:08.334808 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-bdlrw" event={"ID":"b67d7b0e-0fdf-4585-a96e-98063b80e4c3","Type":"ContainerStarted","Data":"d1289fb21218828f83ad06decd165b9c75e3c2257b65fbdc4c02ef305e1abf68"}
Apr 24 23:55:08.336449 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:08.336393 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-ddrd5" event={"ID":"582002e5-6e19-4c7c-afa8-2c680db672f4","Type":"ContainerStarted","Data":"4ceb9d65900277b915f3ff6a77c703e5c73360f61dc966eb23db25c3561a6a63"}
Apr 24 23:55:08.337109 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:08.336800 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-ddrd5"
Apr 24 23:55:08.338068 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:08.338040 2569 patch_prober.go:28] interesting pod/downloads-6bcc868b7-ddrd5 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.133.0.14:8080/\": dial tcp 
10.133.0.14:8080: connect: connection refused" start-of-body= Apr 24 23:55:08.338146 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:08.338096 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-6bcc868b7-ddrd5" podUID="582002e5-6e19-4c7c-afa8-2c680db672f4" containerName="download-server" probeResult="failure" output="Get \"http://10.133.0.14:8080/\": dial tcp 10.133.0.14:8080: connect: connection refused" Apr 24 23:55:09.341162 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:09.341117 2569 generic.go:358] "Generic (PLEG): container finished" podID="b67d7b0e-0fdf-4585-a96e-98063b80e4c3" containerID="e447e523b6a75b0a8be99e555ae98f570776971a9ca05c779ff358f25686c378" exitCode=0 Apr 24 23:55:09.341672 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:09.341213 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-bdlrw" event={"ID":"b67d7b0e-0fdf-4585-a96e-98063b80e4c3","Type":"ContainerDied","Data":"e447e523b6a75b0a8be99e555ae98f570776971a9ca05c779ff358f25686c378"} Apr 24 23:55:09.352368 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:09.352348 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-ddrd5" Apr 24 23:55:09.362420 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:09.362274 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-ddrd5" podStartSLOduration=1.780248284 podStartE2EDuration="19.362257142s" podCreationTimestamp="2026-04-24 23:54:50 +0000 UTC" firstStartedPulling="2026-04-24 23:54:50.553664795 +0000 UTC m=+69.158391371" lastFinishedPulling="2026-04-24 23:55:08.135673639 +0000 UTC m=+86.740400229" observedRunningTime="2026-04-24 23:55:08.352215606 +0000 UTC m=+86.956942205" watchObservedRunningTime="2026-04-24 23:55:09.362257142 +0000 UTC m=+87.966983741" Apr 24 23:55:10.067934 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:10.067903 2569 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-779c988769-4lp9b" Apr 24 23:55:10.347234 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:10.347160 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-bdlrw" event={"ID":"b67d7b0e-0fdf-4585-a96e-98063b80e4c3","Type":"ContainerStarted","Data":"9a9c96713561c11b23bd29e634c4d5f39c9054496c14755ca8eb4343d8769c98"} Apr 24 23:55:10.347234 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:10.347198 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-bdlrw" event={"ID":"b67d7b0e-0fdf-4585-a96e-98063b80e4c3","Type":"ContainerStarted","Data":"2bbb74fe890510b8b2380f6e7102c0fcddc3fa57db891adcd57e359fb5bcde36"} Apr 24 23:55:10.384365 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:10.384312 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-bdlrw" podStartSLOduration=6.568450852 podStartE2EDuration="7.384297853s" podCreationTimestamp="2026-04-24 23:55:03 +0000 UTC" firstStartedPulling="2026-04-24 23:55:08.049012862 +0000 UTC m=+86.653739438" lastFinishedPulling="2026-04-24 23:55:08.864859848 +0000 UTC m=+87.469586439" observedRunningTime="2026-04-24 23:55:10.376090825 +0000 UTC m=+88.980817443" watchObservedRunningTime="2026-04-24 23:55:10.384297853 +0000 UTC m=+88.989024445" Apr 24 23:55:13.285864 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:13.285834 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-76d68c7b68-hgbqp" Apr 24 23:55:15.086709 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:15.086633 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-779c988769-4lp9b" podUID="b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7" containerName="registry" 
containerID="cri-o://6af9c236d96d16a2f221508ea445ca783f96e8038f787d80b357da562c944599" gracePeriod=30 Apr 24 23:55:15.352559 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:15.352530 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-779c988769-4lp9b" Apr 24 23:55:15.364874 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:15.364840 2569 generic.go:358] "Generic (PLEG): container finished" podID="b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7" containerID="6af9c236d96d16a2f221508ea445ca783f96e8038f787d80b357da562c944599" exitCode=0 Apr 24 23:55:15.365006 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:15.364902 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-779c988769-4lp9b" Apr 24 23:55:15.365006 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:15.364925 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-779c988769-4lp9b" event={"ID":"b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7","Type":"ContainerDied","Data":"6af9c236d96d16a2f221508ea445ca783f96e8038f787d80b357da562c944599"} Apr 24 23:55:15.365006 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:15.364972 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-779c988769-4lp9b" event={"ID":"b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7","Type":"ContainerDied","Data":"852658d691f413501fca9051d2bca665fe1eecf57ca5f19c62b95517e21ae0af"} Apr 24 23:55:15.365006 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:15.364994 2569 scope.go:117] "RemoveContainer" containerID="6af9c236d96d16a2f221508ea445ca783f96e8038f787d80b357da562c944599" Apr 24 23:55:15.374133 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:15.374111 2569 scope.go:117] "RemoveContainer" containerID="6af9c236d96d16a2f221508ea445ca783f96e8038f787d80b357da562c944599" Apr 24 23:55:15.374572 ip-10-0-133-214 kubenswrapper[2569]: E0424 
23:55:15.374482 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6af9c236d96d16a2f221508ea445ca783f96e8038f787d80b357da562c944599\": container with ID starting with 6af9c236d96d16a2f221508ea445ca783f96e8038f787d80b357da562c944599 not found: ID does not exist" containerID="6af9c236d96d16a2f221508ea445ca783f96e8038f787d80b357da562c944599" Apr 24 23:55:15.374572 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:15.374520 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6af9c236d96d16a2f221508ea445ca783f96e8038f787d80b357da562c944599"} err="failed to get container status \"6af9c236d96d16a2f221508ea445ca783f96e8038f787d80b357da562c944599\": rpc error: code = NotFound desc = could not find container \"6af9c236d96d16a2f221508ea445ca783f96e8038f787d80b357da562c944599\": container with ID starting with 6af9c236d96d16a2f221508ea445ca783f96e8038f787d80b357da562c944599 not found: ID does not exist" Apr 24 23:55:15.395200 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:15.395166 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7-registry-certificates\") pod \"b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7\" (UID: \"b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7\") " Apr 24 23:55:15.395200 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:15.395200 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7-installation-pull-secrets\") pod \"b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7\" (UID: \"b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7\") " Apr 24 23:55:15.395380 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:15.395225 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" 
(UniqueName: \"kubernetes.io/projected/b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7-bound-sa-token\") pod \"b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7\" (UID: \"b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7\") " Apr 24 23:55:15.395380 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:15.395266 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7-trusted-ca\") pod \"b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7\" (UID: \"b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7\") " Apr 24 23:55:15.395380 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:15.395307 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7-registry-tls\") pod \"b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7\" (UID: \"b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7\") " Apr 24 23:55:15.395596 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:15.395382 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7-image-registry-private-configuration\") pod \"b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7\" (UID: \"b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7\") " Apr 24 23:55:15.395596 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:15.395440 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7-ca-trust-extracted\") pod \"b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7\" (UID: \"b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7\") " Apr 24 23:55:15.395596 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:15.395467 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4hhjv\" (UniqueName: 
\"kubernetes.io/projected/b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7-kube-api-access-4hhjv\") pod \"b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7\" (UID: \"b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7\") " Apr 24 23:55:15.395748 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:15.395586 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7" (UID: "b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 23:55:15.395748 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:15.395708 2569 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7-registry-certificates\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\"" Apr 24 23:55:15.395856 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:15.395741 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7" (UID: "b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 23:55:15.398365 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:15.398318 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7-kube-api-access-4hhjv" (OuterVolumeSpecName: "kube-api-access-4hhjv") pod "b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7" (UID: "b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7"). InnerVolumeSpecName "kube-api-access-4hhjv". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 23:55:15.398488 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:15.398377 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7" (UID: "b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 23:55:15.398488 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:15.398383 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7" (UID: "b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 23:55:15.398488 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:15.398453 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7" (UID: "b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 23:55:15.399015 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:15.398985 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7" (UID: "b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7"). InnerVolumeSpecName "image-registry-private-configuration". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 23:55:15.407146 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:15.407114 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7" (UID: "b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 23:55:15.496536 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:15.496497 2569 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7-image-registry-private-configuration\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\"" Apr 24 23:55:15.496536 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:15.496534 2569 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7-ca-trust-extracted\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\"" Apr 24 23:55:15.496763 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:15.496549 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4hhjv\" (UniqueName: \"kubernetes.io/projected/b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7-kube-api-access-4hhjv\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\"" Apr 24 23:55:15.496763 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:15.496560 2569 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7-installation-pull-secrets\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\"" Apr 24 23:55:15.496763 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:15.496570 2569 reconciler_common.go:299] "Volume detached for volume 
\"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7-bound-sa-token\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\"" Apr 24 23:55:15.496763 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:15.496579 2569 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7-trusted-ca\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\"" Apr 24 23:55:15.496763 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:15.496587 2569 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7-registry-tls\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\"" Apr 24 23:55:15.690359 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:15.690325 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-779c988769-4lp9b"] Apr 24 23:55:15.693506 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:15.693475 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-779c988769-4lp9b"] Apr 24 23:55:15.977584 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:15.977513 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7" path="/var/lib/kubelet/pods/b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7/volumes" Apr 24 23:55:16.053721 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:16.053686 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5564cf54bd-q2dz5"] Apr 24 23:55:16.054037 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:16.054021 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7" containerName="registry" Apr 24 23:55:16.054037 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:16.054038 2569 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7" containerName="registry" Apr 24 23:55:16.054147 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:16.054120 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="b2ca4df1-8494-46e1-b4e0-c81aba8ca1a7" containerName="registry" Apr 24 23:55:16.075619 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:16.075594 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5564cf54bd-q2dz5"] Apr 24 23:55:16.075771 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:16.075715 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5564cf54bd-q2dz5" Apr 24 23:55:16.092330 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:16.092295 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 24 23:55:16.202090 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:16.202056 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kdgz\" (UniqueName: \"kubernetes.io/projected/c046fff7-da02-4064-9979-ea6734ee1d6c-kube-api-access-8kdgz\") pod \"console-5564cf54bd-q2dz5\" (UID: \"c046fff7-da02-4064-9979-ea6734ee1d6c\") " pod="openshift-console/console-5564cf54bd-q2dz5" Apr 24 23:55:16.202311 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:16.202119 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c046fff7-da02-4064-9979-ea6734ee1d6c-oauth-serving-cert\") pod \"console-5564cf54bd-q2dz5\" (UID: \"c046fff7-da02-4064-9979-ea6734ee1d6c\") " pod="openshift-console/console-5564cf54bd-q2dz5" Apr 24 23:55:16.202311 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:16.202174 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/c046fff7-da02-4064-9979-ea6734ee1d6c-trusted-ca-bundle\") pod \"console-5564cf54bd-q2dz5\" (UID: \"c046fff7-da02-4064-9979-ea6734ee1d6c\") " pod="openshift-console/console-5564cf54bd-q2dz5" Apr 24 23:55:16.202311 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:16.202204 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c046fff7-da02-4064-9979-ea6734ee1d6c-service-ca\") pod \"console-5564cf54bd-q2dz5\" (UID: \"c046fff7-da02-4064-9979-ea6734ee1d6c\") " pod="openshift-console/console-5564cf54bd-q2dz5" Apr 24 23:55:16.202311 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:16.202231 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c046fff7-da02-4064-9979-ea6734ee1d6c-console-oauth-config\") pod \"console-5564cf54bd-q2dz5\" (UID: \"c046fff7-da02-4064-9979-ea6734ee1d6c\") " pod="openshift-console/console-5564cf54bd-q2dz5" Apr 24 23:55:16.202311 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:16.202303 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c046fff7-da02-4064-9979-ea6734ee1d6c-console-serving-cert\") pod \"console-5564cf54bd-q2dz5\" (UID: \"c046fff7-da02-4064-9979-ea6734ee1d6c\") " pod="openshift-console/console-5564cf54bd-q2dz5" Apr 24 23:55:16.202601 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:16.202341 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c046fff7-da02-4064-9979-ea6734ee1d6c-console-config\") pod \"console-5564cf54bd-q2dz5\" (UID: \"c046fff7-da02-4064-9979-ea6734ee1d6c\") " pod="openshift-console/console-5564cf54bd-q2dz5" Apr 24 23:55:16.303458 ip-10-0-133-214 kubenswrapper[2569]: I0424 
23:55:16.303421 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8kdgz\" (UniqueName: \"kubernetes.io/projected/c046fff7-da02-4064-9979-ea6734ee1d6c-kube-api-access-8kdgz\") pod \"console-5564cf54bd-q2dz5\" (UID: \"c046fff7-da02-4064-9979-ea6734ee1d6c\") " pod="openshift-console/console-5564cf54bd-q2dz5" Apr 24 23:55:16.303677 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:16.303489 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c046fff7-da02-4064-9979-ea6734ee1d6c-oauth-serving-cert\") pod \"console-5564cf54bd-q2dz5\" (UID: \"c046fff7-da02-4064-9979-ea6734ee1d6c\") " pod="openshift-console/console-5564cf54bd-q2dz5" Apr 24 23:55:16.303677 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:16.303525 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c046fff7-da02-4064-9979-ea6734ee1d6c-trusted-ca-bundle\") pod \"console-5564cf54bd-q2dz5\" (UID: \"c046fff7-da02-4064-9979-ea6734ee1d6c\") " pod="openshift-console/console-5564cf54bd-q2dz5" Apr 24 23:55:16.303677 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:16.303552 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c046fff7-da02-4064-9979-ea6734ee1d6c-service-ca\") pod \"console-5564cf54bd-q2dz5\" (UID: \"c046fff7-da02-4064-9979-ea6734ee1d6c\") " pod="openshift-console/console-5564cf54bd-q2dz5" Apr 24 23:55:16.303677 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:16.303599 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c046fff7-da02-4064-9979-ea6734ee1d6c-console-oauth-config\") pod \"console-5564cf54bd-q2dz5\" (UID: \"c046fff7-da02-4064-9979-ea6734ee1d6c\") " 
pod="openshift-console/console-5564cf54bd-q2dz5" Apr 24 23:55:16.303677 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:16.303637 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c046fff7-da02-4064-9979-ea6734ee1d6c-console-serving-cert\") pod \"console-5564cf54bd-q2dz5\" (UID: \"c046fff7-da02-4064-9979-ea6734ee1d6c\") " pod="openshift-console/console-5564cf54bd-q2dz5" Apr 24 23:55:16.303677 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:16.303672 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c046fff7-da02-4064-9979-ea6734ee1d6c-console-config\") pod \"console-5564cf54bd-q2dz5\" (UID: \"c046fff7-da02-4064-9979-ea6734ee1d6c\") " pod="openshift-console/console-5564cf54bd-q2dz5" Apr 24 23:55:16.304219 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:16.304194 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c046fff7-da02-4064-9979-ea6734ee1d6c-oauth-serving-cert\") pod \"console-5564cf54bd-q2dz5\" (UID: \"c046fff7-da02-4064-9979-ea6734ee1d6c\") " pod="openshift-console/console-5564cf54bd-q2dz5" Apr 24 23:55:16.304462 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:16.304442 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c046fff7-da02-4064-9979-ea6734ee1d6c-trusted-ca-bundle\") pod \"console-5564cf54bd-q2dz5\" (UID: \"c046fff7-da02-4064-9979-ea6734ee1d6c\") " pod="openshift-console/console-5564cf54bd-q2dz5" Apr 24 23:55:16.306448 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:16.306424 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c046fff7-da02-4064-9979-ea6734ee1d6c-console-oauth-config\") pod \"console-5564cf54bd-q2dz5\" 
(UID: \"c046fff7-da02-4064-9979-ea6734ee1d6c\") " pod="openshift-console/console-5564cf54bd-q2dz5" Apr 24 23:55:16.306629 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:16.306608 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c046fff7-da02-4064-9979-ea6734ee1d6c-console-serving-cert\") pod \"console-5564cf54bd-q2dz5\" (UID: \"c046fff7-da02-4064-9979-ea6734ee1d6c\") " pod="openshift-console/console-5564cf54bd-q2dz5" Apr 24 23:55:16.312384 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:16.312325 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c046fff7-da02-4064-9979-ea6734ee1d6c-service-ca\") pod \"console-5564cf54bd-q2dz5\" (UID: \"c046fff7-da02-4064-9979-ea6734ee1d6c\") " pod="openshift-console/console-5564cf54bd-q2dz5" Apr 24 23:55:16.312384 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:16.312325 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c046fff7-da02-4064-9979-ea6734ee1d6c-console-config\") pod \"console-5564cf54bd-q2dz5\" (UID: \"c046fff7-da02-4064-9979-ea6734ee1d6c\") " pod="openshift-console/console-5564cf54bd-q2dz5" Apr 24 23:55:16.315429 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:16.315386 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kdgz\" (UniqueName: \"kubernetes.io/projected/c046fff7-da02-4064-9979-ea6734ee1d6c-kube-api-access-8kdgz\") pod \"console-5564cf54bd-q2dz5\" (UID: \"c046fff7-da02-4064-9979-ea6734ee1d6c\") " pod="openshift-console/console-5564cf54bd-q2dz5" Apr 24 23:55:16.386770 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:16.386735 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5564cf54bd-q2dz5" Apr 24 23:55:16.524905 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:16.524879 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5564cf54bd-q2dz5"] Apr 24 23:55:16.527602 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:55:16.527553 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc046fff7_da02_4064_9979_ea6734ee1d6c.slice/crio-5bd8c3ad6a9dc7d3d6f6eadd2f437432aa2aa0e87d9665d4990febcb0a45391f WatchSource:0}: Error finding container 5bd8c3ad6a9dc7d3d6f6eadd2f437432aa2aa0e87d9665d4990febcb0a45391f: Status 404 returned error can't find the container with id 5bd8c3ad6a9dc7d3d6f6eadd2f437432aa2aa0e87d9665d4990febcb0a45391f Apr 24 23:55:17.373629 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:17.373551 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5564cf54bd-q2dz5" event={"ID":"c046fff7-da02-4064-9979-ea6734ee1d6c","Type":"ContainerStarted","Data":"379ca10954071eae272bb035b16ee26ae42c337cfc0d7be6fb8534f5df31ffad"} Apr 24 23:55:17.373629 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:17.373603 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5564cf54bd-q2dz5" event={"ID":"c046fff7-da02-4064-9979-ea6734ee1d6c","Type":"ContainerStarted","Data":"5bd8c3ad6a9dc7d3d6f6eadd2f437432aa2aa0e87d9665d4990febcb0a45391f"} Apr 24 23:55:17.395576 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:17.393326 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5564cf54bd-q2dz5" podStartSLOduration=1.393308208 podStartE2EDuration="1.393308208s" podCreationTimestamp="2026-04-24 23:55:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 23:55:17.390815657 +0000 UTC 
m=+95.995542252" watchObservedRunningTime="2026-04-24 23:55:17.393308208 +0000 UTC m=+95.998034808" Apr 24 23:55:26.387577 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:26.387540 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5564cf54bd-q2dz5" Apr 24 23:55:26.387577 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:26.387584 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5564cf54bd-q2dz5" Apr 24 23:55:26.392353 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:26.392332 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5564cf54bd-q2dz5" Apr 24 23:55:26.400161 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:26.400138 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5564cf54bd-q2dz5" Apr 24 23:55:33.280110 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:33.280065 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5695878b76-x468z" podUID="37d2d963-9753-4296-ab1e-ae0df5ec3511" containerName="console" containerID="cri-o://b20afbdd6199544e877dea2505c6299c6409208bc7f37d4040ee325e88d99bef" gracePeriod=15 Apr 24 23:55:33.415240 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:33.415215 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5695878b76-x468z_37d2d963-9753-4296-ab1e-ae0df5ec3511/console/0.log" Apr 24 23:55:33.415435 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:33.415252 2569 generic.go:358] "Generic (PLEG): container finished" podID="37d2d963-9753-4296-ab1e-ae0df5ec3511" containerID="b20afbdd6199544e877dea2505c6299c6409208bc7f37d4040ee325e88d99bef" exitCode=2 Apr 24 23:55:33.415435 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:33.415286 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-console/console-5695878b76-x468z" event={"ID":"37d2d963-9753-4296-ab1e-ae0df5ec3511","Type":"ContainerDied","Data":"b20afbdd6199544e877dea2505c6299c6409208bc7f37d4040ee325e88d99bef"} Apr 24 23:55:33.540249 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:33.540190 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5695878b76-x468z_37d2d963-9753-4296-ab1e-ae0df5ec3511/console/0.log" Apr 24 23:55:33.540356 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:33.540253 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5695878b76-x468z" Apr 24 23:55:33.638196 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:33.638161 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/37d2d963-9753-4296-ab1e-ae0df5ec3511-service-ca\") pod \"37d2d963-9753-4296-ab1e-ae0df5ec3511\" (UID: \"37d2d963-9753-4296-ab1e-ae0df5ec3511\") " Apr 24 23:55:33.638348 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:33.638228 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/37d2d963-9753-4296-ab1e-ae0df5ec3511-oauth-serving-cert\") pod \"37d2d963-9753-4296-ab1e-ae0df5ec3511\" (UID: \"37d2d963-9753-4296-ab1e-ae0df5ec3511\") " Apr 24 23:55:33.638348 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:33.638263 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/37d2d963-9753-4296-ab1e-ae0df5ec3511-console-serving-cert\") pod \"37d2d963-9753-4296-ab1e-ae0df5ec3511\" (UID: \"37d2d963-9753-4296-ab1e-ae0df5ec3511\") " Apr 24 23:55:33.638348 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:33.638325 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ghqgb\" (UniqueName: 
\"kubernetes.io/projected/37d2d963-9753-4296-ab1e-ae0df5ec3511-kube-api-access-ghqgb\") pod \"37d2d963-9753-4296-ab1e-ae0df5ec3511\" (UID: \"37d2d963-9753-4296-ab1e-ae0df5ec3511\") " Apr 24 23:55:33.638503 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:33.638358 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/37d2d963-9753-4296-ab1e-ae0df5ec3511-console-oauth-config\") pod \"37d2d963-9753-4296-ab1e-ae0df5ec3511\" (UID: \"37d2d963-9753-4296-ab1e-ae0df5ec3511\") " Apr 24 23:55:33.638503 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:33.638387 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/37d2d963-9753-4296-ab1e-ae0df5ec3511-console-config\") pod \"37d2d963-9753-4296-ab1e-ae0df5ec3511\" (UID: \"37d2d963-9753-4296-ab1e-ae0df5ec3511\") " Apr 24 23:55:33.638609 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:33.638574 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37d2d963-9753-4296-ab1e-ae0df5ec3511-service-ca" (OuterVolumeSpecName: "service-ca") pod "37d2d963-9753-4296-ab1e-ae0df5ec3511" (UID: "37d2d963-9753-4296-ab1e-ae0df5ec3511"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 23:55:33.638671 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:33.638644 2569 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/37d2d963-9753-4296-ab1e-ae0df5ec3511-service-ca\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\"" Apr 24 23:55:33.638727 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:33.638701 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37d2d963-9753-4296-ab1e-ae0df5ec3511-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "37d2d963-9753-4296-ab1e-ae0df5ec3511" (UID: "37d2d963-9753-4296-ab1e-ae0df5ec3511"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 23:55:33.638901 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:33.638871 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37d2d963-9753-4296-ab1e-ae0df5ec3511-console-config" (OuterVolumeSpecName: "console-config") pod "37d2d963-9753-4296-ab1e-ae0df5ec3511" (UID: "37d2d963-9753-4296-ab1e-ae0df5ec3511"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 23:55:33.640651 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:33.640624 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37d2d963-9753-4296-ab1e-ae0df5ec3511-kube-api-access-ghqgb" (OuterVolumeSpecName: "kube-api-access-ghqgb") pod "37d2d963-9753-4296-ab1e-ae0df5ec3511" (UID: "37d2d963-9753-4296-ab1e-ae0df5ec3511"). InnerVolumeSpecName "kube-api-access-ghqgb". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 23:55:33.640870 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:33.640853 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37d2d963-9753-4296-ab1e-ae0df5ec3511-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "37d2d963-9753-4296-ab1e-ae0df5ec3511" (UID: "37d2d963-9753-4296-ab1e-ae0df5ec3511"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 23:55:33.640940 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:33.640889 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37d2d963-9753-4296-ab1e-ae0df5ec3511-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "37d2d963-9753-4296-ab1e-ae0df5ec3511" (UID: "37d2d963-9753-4296-ab1e-ae0df5ec3511"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 23:55:33.739162 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:33.739126 2569 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/37d2d963-9753-4296-ab1e-ae0df5ec3511-console-serving-cert\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\"" Apr 24 23:55:33.739162 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:33.739157 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ghqgb\" (UniqueName: \"kubernetes.io/projected/37d2d963-9753-4296-ab1e-ae0df5ec3511-kube-api-access-ghqgb\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\"" Apr 24 23:55:33.739162 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:33.739167 2569 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/37d2d963-9753-4296-ab1e-ae0df5ec3511-console-oauth-config\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\"" Apr 24 23:55:33.739389 
ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:33.739176 2569 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/37d2d963-9753-4296-ab1e-ae0df5ec3511-console-config\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\"" Apr 24 23:55:33.739389 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:33.739186 2569 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/37d2d963-9753-4296-ab1e-ae0df5ec3511-oauth-serving-cert\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\"" Apr 24 23:55:34.419422 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:34.419382 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5695878b76-x468z_37d2d963-9753-4296-ab1e-ae0df5ec3511/console/0.log" Apr 24 23:55:34.419836 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:34.419542 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5695878b76-x468z" Apr 24 23:55:34.419836 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:34.419534 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5695878b76-x468z" event={"ID":"37d2d963-9753-4296-ab1e-ae0df5ec3511","Type":"ContainerDied","Data":"09cef4c7b73d6170ff3ec9b9d2a90a0ae3bcc80acae45ec0d7d24b06e36b7484"} Apr 24 23:55:34.419836 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:34.419583 2569 scope.go:117] "RemoveContainer" containerID="b20afbdd6199544e877dea2505c6299c6409208bc7f37d4040ee325e88d99bef" Apr 24 23:55:34.436512 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:34.436488 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5695878b76-x468z"] Apr 24 23:55:34.438995 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:34.438971 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5695878b76-x468z"] Apr 24 23:55:35.976279 
ip-10-0-133-214 kubenswrapper[2569]: I0424 23:55:35.976245 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37d2d963-9753-4296-ab1e-ae0df5ec3511" path="/var/lib/kubelet/pods/37d2d963-9753-4296-ab1e-ae0df5ec3511/volumes" Apr 24 23:56:39.939046 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:56:39.938958 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-786d889dfb-86rvz"] Apr 24 23:56:39.939529 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:56:39.939223 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="37d2d963-9753-4296-ab1e-ae0df5ec3511" containerName="console" Apr 24 23:56:39.939529 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:56:39.939234 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="37d2d963-9753-4296-ab1e-ae0df5ec3511" containerName="console" Apr 24 23:56:39.939529 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:56:39.939297 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="37d2d963-9753-4296-ab1e-ae0df5ec3511" containerName="console" Apr 24 23:56:39.942031 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:56:39.942014 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-786d889dfb-86rvz" Apr 24 23:56:39.955978 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:56:39.955950 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-786d889dfb-86rvz"] Apr 24 23:56:40.008194 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:56:40.008157 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6a33cb9a-3c1d-4e67-b523-b81a49b1de4e-console-config\") pod \"console-786d889dfb-86rvz\" (UID: \"6a33cb9a-3c1d-4e67-b523-b81a49b1de4e\") " pod="openshift-console/console-786d889dfb-86rvz" Apr 24 23:56:40.008194 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:56:40.008202 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6a33cb9a-3c1d-4e67-b523-b81a49b1de4e-service-ca\") pod \"console-786d889dfb-86rvz\" (UID: \"6a33cb9a-3c1d-4e67-b523-b81a49b1de4e\") " pod="openshift-console/console-786d889dfb-86rvz" Apr 24 23:56:40.008428 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:56:40.008280 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4g92c\" (UniqueName: \"kubernetes.io/projected/6a33cb9a-3c1d-4e67-b523-b81a49b1de4e-kube-api-access-4g92c\") pod \"console-786d889dfb-86rvz\" (UID: \"6a33cb9a-3c1d-4e67-b523-b81a49b1de4e\") " pod="openshift-console/console-786d889dfb-86rvz" Apr 24 23:56:40.008428 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:56:40.008313 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6a33cb9a-3c1d-4e67-b523-b81a49b1de4e-oauth-serving-cert\") pod \"console-786d889dfb-86rvz\" (UID: \"6a33cb9a-3c1d-4e67-b523-b81a49b1de4e\") " pod="openshift-console/console-786d889dfb-86rvz" Apr 24 
23:56:40.008428 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:56:40.008398 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6a33cb9a-3c1d-4e67-b523-b81a49b1de4e-console-oauth-config\") pod \"console-786d889dfb-86rvz\" (UID: \"6a33cb9a-3c1d-4e67-b523-b81a49b1de4e\") " pod="openshift-console/console-786d889dfb-86rvz" Apr 24 23:56:40.008568 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:56:40.008547 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6a33cb9a-3c1d-4e67-b523-b81a49b1de4e-console-serving-cert\") pod \"console-786d889dfb-86rvz\" (UID: \"6a33cb9a-3c1d-4e67-b523-b81a49b1de4e\") " pod="openshift-console/console-786d889dfb-86rvz" Apr 24 23:56:40.008608 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:56:40.008588 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6a33cb9a-3c1d-4e67-b523-b81a49b1de4e-trusted-ca-bundle\") pod \"console-786d889dfb-86rvz\" (UID: \"6a33cb9a-3c1d-4e67-b523-b81a49b1de4e\") " pod="openshift-console/console-786d889dfb-86rvz" Apr 24 23:56:40.109455 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:56:40.109385 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6a33cb9a-3c1d-4e67-b523-b81a49b1de4e-console-config\") pod \"console-786d889dfb-86rvz\" (UID: \"6a33cb9a-3c1d-4e67-b523-b81a49b1de4e\") " pod="openshift-console/console-786d889dfb-86rvz" Apr 24 23:56:40.109455 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:56:40.109458 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6a33cb9a-3c1d-4e67-b523-b81a49b1de4e-service-ca\") pod 
\"console-786d889dfb-86rvz\" (UID: \"6a33cb9a-3c1d-4e67-b523-b81a49b1de4e\") " pod="openshift-console/console-786d889dfb-86rvz" Apr 24 23:56:40.109643 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:56:40.109558 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4g92c\" (UniqueName: \"kubernetes.io/projected/6a33cb9a-3c1d-4e67-b523-b81a49b1de4e-kube-api-access-4g92c\") pod \"console-786d889dfb-86rvz\" (UID: \"6a33cb9a-3c1d-4e67-b523-b81a49b1de4e\") " pod="openshift-console/console-786d889dfb-86rvz" Apr 24 23:56:40.109643 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:56:40.109608 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6a33cb9a-3c1d-4e67-b523-b81a49b1de4e-oauth-serving-cert\") pod \"console-786d889dfb-86rvz\" (UID: \"6a33cb9a-3c1d-4e67-b523-b81a49b1de4e\") " pod="openshift-console/console-786d889dfb-86rvz" Apr 24 23:56:40.109718 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:56:40.109687 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6a33cb9a-3c1d-4e67-b523-b81a49b1de4e-console-oauth-config\") pod \"console-786d889dfb-86rvz\" (UID: \"6a33cb9a-3c1d-4e67-b523-b81a49b1de4e\") " pod="openshift-console/console-786d889dfb-86rvz" Apr 24 23:56:40.109754 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:56:40.109718 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6a33cb9a-3c1d-4e67-b523-b81a49b1de4e-console-serving-cert\") pod \"console-786d889dfb-86rvz\" (UID: \"6a33cb9a-3c1d-4e67-b523-b81a49b1de4e\") " pod="openshift-console/console-786d889dfb-86rvz" Apr 24 23:56:40.109754 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:56:40.109748 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/6a33cb9a-3c1d-4e67-b523-b81a49b1de4e-trusted-ca-bundle\") pod \"console-786d889dfb-86rvz\" (UID: \"6a33cb9a-3c1d-4e67-b523-b81a49b1de4e\") " pod="openshift-console/console-786d889dfb-86rvz" Apr 24 23:56:40.110148 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:56:40.110114 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6a33cb9a-3c1d-4e67-b523-b81a49b1de4e-service-ca\") pod \"console-786d889dfb-86rvz\" (UID: \"6a33cb9a-3c1d-4e67-b523-b81a49b1de4e\") " pod="openshift-console/console-786d889dfb-86rvz" Apr 24 23:56:40.110297 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:56:40.110235 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6a33cb9a-3c1d-4e67-b523-b81a49b1de4e-console-config\") pod \"console-786d889dfb-86rvz\" (UID: \"6a33cb9a-3c1d-4e67-b523-b81a49b1de4e\") " pod="openshift-console/console-786d889dfb-86rvz" Apr 24 23:56:40.110297 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:56:40.110256 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6a33cb9a-3c1d-4e67-b523-b81a49b1de4e-oauth-serving-cert\") pod \"console-786d889dfb-86rvz\" (UID: \"6a33cb9a-3c1d-4e67-b523-b81a49b1de4e\") " pod="openshift-console/console-786d889dfb-86rvz" Apr 24 23:56:40.110697 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:56:40.110681 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6a33cb9a-3c1d-4e67-b523-b81a49b1de4e-trusted-ca-bundle\") pod \"console-786d889dfb-86rvz\" (UID: \"6a33cb9a-3c1d-4e67-b523-b81a49b1de4e\") " pod="openshift-console/console-786d889dfb-86rvz" Apr 24 23:56:40.112163 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:56:40.112138 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6a33cb9a-3c1d-4e67-b523-b81a49b1de4e-console-oauth-config\") pod \"console-786d889dfb-86rvz\" (UID: \"6a33cb9a-3c1d-4e67-b523-b81a49b1de4e\") " pod="openshift-console/console-786d889dfb-86rvz" Apr 24 23:56:40.112289 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:56:40.112274 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6a33cb9a-3c1d-4e67-b523-b81a49b1de4e-console-serving-cert\") pod \"console-786d889dfb-86rvz\" (UID: \"6a33cb9a-3c1d-4e67-b523-b81a49b1de4e\") " pod="openshift-console/console-786d889dfb-86rvz" Apr 24 23:56:40.118475 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:56:40.118449 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4g92c\" (UniqueName: \"kubernetes.io/projected/6a33cb9a-3c1d-4e67-b523-b81a49b1de4e-kube-api-access-4g92c\") pod \"console-786d889dfb-86rvz\" (UID: \"6a33cb9a-3c1d-4e67-b523-b81a49b1de4e\") " pod="openshift-console/console-786d889dfb-86rvz" Apr 24 23:56:40.250733 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:56:40.250702 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-786d889dfb-86rvz" Apr 24 23:56:40.573872 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:56:40.573849 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-786d889dfb-86rvz"] Apr 24 23:56:40.577104 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:56:40.577075 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a33cb9a_3c1d_4e67_b523_b81a49b1de4e.slice/crio-46f0d05e361bdab10d28bf9f3e87249011564e7c7a4e4587fd2518a2616edc7c WatchSource:0}: Error finding container 46f0d05e361bdab10d28bf9f3e87249011564e7c7a4e4587fd2518a2616edc7c: Status 404 returned error can't find the container with id 46f0d05e361bdab10d28bf9f3e87249011564e7c7a4e4587fd2518a2616edc7c Apr 24 23:56:40.594605 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:56:40.594576 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-786d889dfb-86rvz" event={"ID":"6a33cb9a-3c1d-4e67-b523-b81a49b1de4e","Type":"ContainerStarted","Data":"46f0d05e361bdab10d28bf9f3e87249011564e7c7a4e4587fd2518a2616edc7c"} Apr 24 23:56:41.597910 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:56:41.597876 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-786d889dfb-86rvz" event={"ID":"6a33cb9a-3c1d-4e67-b523-b81a49b1de4e","Type":"ContainerStarted","Data":"1d1c1d0c3e5e40a25208161ca305660e57c4eeb83d907c3bb50d0bde6a5ff30b"} Apr 24 23:56:41.617191 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:56:41.617146 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-786d889dfb-86rvz" podStartSLOduration=2.617132594 podStartE2EDuration="2.617132594s" podCreationTimestamp="2026-04-24 23:56:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 23:56:41.615118079 +0000 UTC 
m=+180.219844677" watchObservedRunningTime="2026-04-24 23:56:41.617132594 +0000 UTC m=+180.221859192" Apr 24 23:56:50.251014 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:56:50.250980 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-786d889dfb-86rvz" Apr 24 23:56:50.251512 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:56:50.251025 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-786d889dfb-86rvz" Apr 24 23:56:50.255632 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:56:50.255608 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-786d889dfb-86rvz" Apr 24 23:56:50.625505 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:56:50.625429 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-786d889dfb-86rvz" Apr 24 23:56:50.674498 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:56:50.674465 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5564cf54bd-q2dz5"] Apr 24 23:57:14.015602 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:14.015569 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c5ddjs"] Apr 24 23:57:14.018828 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:14.018811 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c5ddjs" Apr 24 23:57:14.021339 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:14.021313 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 24 23:57:14.021476 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:14.021347 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 24 23:57:14.022348 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:14.022325 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-7pf4m\"" Apr 24 23:57:14.027578 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:14.027558 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c5ddjs"] Apr 24 23:57:14.167524 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:14.167476 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/76ce9c06-e24a-457a-93b2-f4675149ec48-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c5ddjs\" (UID: \"76ce9c06-e24a-457a-93b2-f4675149ec48\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c5ddjs" Apr 24 23:57:14.167524 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:14.167528 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/76ce9c06-e24a-457a-93b2-f4675149ec48-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c5ddjs\" (UID: \"76ce9c06-e24a-457a-93b2-f4675149ec48\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c5ddjs" Apr 24 23:57:14.167720 
ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:14.167584 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w92np\" (UniqueName: \"kubernetes.io/projected/76ce9c06-e24a-457a-93b2-f4675149ec48-kube-api-access-w92np\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c5ddjs\" (UID: \"76ce9c06-e24a-457a-93b2-f4675149ec48\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c5ddjs" Apr 24 23:57:14.268161 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:14.268095 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w92np\" (UniqueName: \"kubernetes.io/projected/76ce9c06-e24a-457a-93b2-f4675149ec48-kube-api-access-w92np\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c5ddjs\" (UID: \"76ce9c06-e24a-457a-93b2-f4675149ec48\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c5ddjs" Apr 24 23:57:14.268161 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:14.268153 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/76ce9c06-e24a-457a-93b2-f4675149ec48-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c5ddjs\" (UID: \"76ce9c06-e24a-457a-93b2-f4675149ec48\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c5ddjs" Apr 24 23:57:14.268294 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:14.268171 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/76ce9c06-e24a-457a-93b2-f4675149ec48-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c5ddjs\" (UID: \"76ce9c06-e24a-457a-93b2-f4675149ec48\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c5ddjs" Apr 24 23:57:14.268500 ip-10-0-133-214 
kubenswrapper[2569]: I0424 23:57:14.268485 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/76ce9c06-e24a-457a-93b2-f4675149ec48-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c5ddjs\" (UID: \"76ce9c06-e24a-457a-93b2-f4675149ec48\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c5ddjs"
Apr 24 23:57:14.268541 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:14.268521 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/76ce9c06-e24a-457a-93b2-f4675149ec48-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c5ddjs\" (UID: \"76ce9c06-e24a-457a-93b2-f4675149ec48\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c5ddjs"
Apr 24 23:57:14.277458 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:14.277428 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w92np\" (UniqueName: \"kubernetes.io/projected/76ce9c06-e24a-457a-93b2-f4675149ec48-kube-api-access-w92np\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c5ddjs\" (UID: \"76ce9c06-e24a-457a-93b2-f4675149ec48\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c5ddjs"
Apr 24 23:57:14.327946 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:14.327924 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c5ddjs"
Apr 24 23:57:14.445478 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:14.445441 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c5ddjs"]
Apr 24 23:57:14.448291 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:57:14.448262 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76ce9c06_e24a_457a_93b2_f4675149ec48.slice/crio-0ad3ce6f5f57022dd599ccefaa515318c1c658711195956e6c2c7cd0496fcf79 WatchSource:0}: Error finding container 0ad3ce6f5f57022dd599ccefaa515318c1c658711195956e6c2c7cd0496fcf79: Status 404 returned error can't find the container with id 0ad3ce6f5f57022dd599ccefaa515318c1c658711195956e6c2c7cd0496fcf79
Apr 24 23:57:14.684759 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:14.684677 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c5ddjs" event={"ID":"76ce9c06-e24a-457a-93b2-f4675149ec48","Type":"ContainerStarted","Data":"0ad3ce6f5f57022dd599ccefaa515318c1c658711195956e6c2c7cd0496fcf79"}
Apr 24 23:57:15.698026 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:15.697987 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5564cf54bd-q2dz5" podUID="c046fff7-da02-4064-9979-ea6734ee1d6c" containerName="console" containerID="cri-o://379ca10954071eae272bb035b16ee26ae42c337cfc0d7be6fb8534f5df31ffad" gracePeriod=15
Apr 24 23:57:15.945586 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:15.945566 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5564cf54bd-q2dz5_c046fff7-da02-4064-9979-ea6734ee1d6c/console/0.log"
Apr 24 23:57:15.945722 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:15.945634 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5564cf54bd-q2dz5"
Apr 24 23:57:16.083521 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:16.083487 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c046fff7-da02-4064-9979-ea6734ee1d6c-trusted-ca-bundle\") pod \"c046fff7-da02-4064-9979-ea6734ee1d6c\" (UID: \"c046fff7-da02-4064-9979-ea6734ee1d6c\") "
Apr 24 23:57:16.083718 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:16.083536 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8kdgz\" (UniqueName: \"kubernetes.io/projected/c046fff7-da02-4064-9979-ea6734ee1d6c-kube-api-access-8kdgz\") pod \"c046fff7-da02-4064-9979-ea6734ee1d6c\" (UID: \"c046fff7-da02-4064-9979-ea6734ee1d6c\") "
Apr 24 23:57:16.083718 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:16.083588 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c046fff7-da02-4064-9979-ea6734ee1d6c-console-serving-cert\") pod \"c046fff7-da02-4064-9979-ea6734ee1d6c\" (UID: \"c046fff7-da02-4064-9979-ea6734ee1d6c\") "
Apr 24 23:57:16.083718 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:16.083644 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c046fff7-da02-4064-9979-ea6734ee1d6c-console-config\") pod \"c046fff7-da02-4064-9979-ea6734ee1d6c\" (UID: \"c046fff7-da02-4064-9979-ea6734ee1d6c\") "
Apr 24 23:57:16.083718 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:16.083678 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c046fff7-da02-4064-9979-ea6734ee1d6c-service-ca\") pod \"c046fff7-da02-4064-9979-ea6734ee1d6c\" (UID: \"c046fff7-da02-4064-9979-ea6734ee1d6c\") "
Apr 24 23:57:16.083718 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:16.083708 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c046fff7-da02-4064-9979-ea6734ee1d6c-oauth-serving-cert\") pod \"c046fff7-da02-4064-9979-ea6734ee1d6c\" (UID: \"c046fff7-da02-4064-9979-ea6734ee1d6c\") "
Apr 24 23:57:16.083921 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:16.083735 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c046fff7-da02-4064-9979-ea6734ee1d6c-console-oauth-config\") pod \"c046fff7-da02-4064-9979-ea6734ee1d6c\" (UID: \"c046fff7-da02-4064-9979-ea6734ee1d6c\") "
Apr 24 23:57:16.084244 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:16.084216 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c046fff7-da02-4064-9979-ea6734ee1d6c-console-config" (OuterVolumeSpecName: "console-config") pod "c046fff7-da02-4064-9979-ea6734ee1d6c" (UID: "c046fff7-da02-4064-9979-ea6734ee1d6c"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 23:57:16.084376 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:16.084287 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c046fff7-da02-4064-9979-ea6734ee1d6c-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "c046fff7-da02-4064-9979-ea6734ee1d6c" (UID: "c046fff7-da02-4064-9979-ea6734ee1d6c"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 23:57:16.084476 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:16.084378 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c046fff7-da02-4064-9979-ea6734ee1d6c-service-ca" (OuterVolumeSpecName: "service-ca") pod "c046fff7-da02-4064-9979-ea6734ee1d6c" (UID: "c046fff7-da02-4064-9979-ea6734ee1d6c"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 23:57:16.084591 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:16.084565 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c046fff7-da02-4064-9979-ea6734ee1d6c-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "c046fff7-da02-4064-9979-ea6734ee1d6c" (UID: "c046fff7-da02-4064-9979-ea6734ee1d6c"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 23:57:16.085894 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:16.085872 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c046fff7-da02-4064-9979-ea6734ee1d6c-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "c046fff7-da02-4064-9979-ea6734ee1d6c" (UID: "c046fff7-da02-4064-9979-ea6734ee1d6c"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 23:57:16.086639 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:16.086613 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c046fff7-da02-4064-9979-ea6734ee1d6c-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "c046fff7-da02-4064-9979-ea6734ee1d6c" (UID: "c046fff7-da02-4064-9979-ea6734ee1d6c"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 23:57:16.086639 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:16.086625 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c046fff7-da02-4064-9979-ea6734ee1d6c-kube-api-access-8kdgz" (OuterVolumeSpecName: "kube-api-access-8kdgz") pod "c046fff7-da02-4064-9979-ea6734ee1d6c" (UID: "c046fff7-da02-4064-9979-ea6734ee1d6c"). InnerVolumeSpecName "kube-api-access-8kdgz". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 23:57:16.185259 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:16.185219 2569 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c046fff7-da02-4064-9979-ea6734ee1d6c-console-config\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\""
Apr 24 23:57:16.185259 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:16.185259 2569 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c046fff7-da02-4064-9979-ea6734ee1d6c-service-ca\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\""
Apr 24 23:57:16.185475 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:16.185273 2569 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c046fff7-da02-4064-9979-ea6734ee1d6c-oauth-serving-cert\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\""
Apr 24 23:57:16.185475 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:16.185290 2569 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c046fff7-da02-4064-9979-ea6734ee1d6c-console-oauth-config\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\""
Apr 24 23:57:16.185475 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:16.185304 2569 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c046fff7-da02-4064-9979-ea6734ee1d6c-trusted-ca-bundle\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\""
Apr 24 23:57:16.185475 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:16.185319 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8kdgz\" (UniqueName: \"kubernetes.io/projected/c046fff7-da02-4064-9979-ea6734ee1d6c-kube-api-access-8kdgz\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\""
Apr 24 23:57:16.185475 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:16.185331 2569 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c046fff7-da02-4064-9979-ea6734ee1d6c-console-serving-cert\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\""
Apr 24 23:57:16.691787 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:16.691758 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5564cf54bd-q2dz5_c046fff7-da02-4064-9979-ea6734ee1d6c/console/0.log"
Apr 24 23:57:16.691973 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:16.691797 2569 generic.go:358] "Generic (PLEG): container finished" podID="c046fff7-da02-4064-9979-ea6734ee1d6c" containerID="379ca10954071eae272bb035b16ee26ae42c337cfc0d7be6fb8534f5df31ffad" exitCode=2
Apr 24 23:57:16.691973 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:16.691834 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5564cf54bd-q2dz5" event={"ID":"c046fff7-da02-4064-9979-ea6734ee1d6c","Type":"ContainerDied","Data":"379ca10954071eae272bb035b16ee26ae42c337cfc0d7be6fb8534f5df31ffad"}
Apr 24 23:57:16.691973 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:16.691871 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5564cf54bd-q2dz5"
Apr 24 23:57:16.691973 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:16.691881 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5564cf54bd-q2dz5" event={"ID":"c046fff7-da02-4064-9979-ea6734ee1d6c","Type":"ContainerDied","Data":"5bd8c3ad6a9dc7d3d6f6eadd2f437432aa2aa0e87d9665d4990febcb0a45391f"}
Apr 24 23:57:16.691973 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:16.691903 2569 scope.go:117] "RemoveContainer" containerID="379ca10954071eae272bb035b16ee26ae42c337cfc0d7be6fb8534f5df31ffad"
Apr 24 23:57:16.701206 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:16.700838 2569 scope.go:117] "RemoveContainer" containerID="379ca10954071eae272bb035b16ee26ae42c337cfc0d7be6fb8534f5df31ffad"
Apr 24 23:57:16.701490 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:57:16.701333 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"379ca10954071eae272bb035b16ee26ae42c337cfc0d7be6fb8534f5df31ffad\": container with ID starting with 379ca10954071eae272bb035b16ee26ae42c337cfc0d7be6fb8534f5df31ffad not found: ID does not exist" containerID="379ca10954071eae272bb035b16ee26ae42c337cfc0d7be6fb8534f5df31ffad"
Apr 24 23:57:16.701490 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:16.701368 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"379ca10954071eae272bb035b16ee26ae42c337cfc0d7be6fb8534f5df31ffad"} err="failed to get container status \"379ca10954071eae272bb035b16ee26ae42c337cfc0d7be6fb8534f5df31ffad\": rpc error: code = NotFound desc = could not find container \"379ca10954071eae272bb035b16ee26ae42c337cfc0d7be6fb8534f5df31ffad\": container with ID starting with 379ca10954071eae272bb035b16ee26ae42c337cfc0d7be6fb8534f5df31ffad not found: ID does not exist"
Apr 24 23:57:16.715714 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:16.715690 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5564cf54bd-q2dz5"]
Apr 24 23:57:16.720184 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:16.720161 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5564cf54bd-q2dz5"]
Apr 24 23:57:17.977918 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:17.977880 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c046fff7-da02-4064-9979-ea6734ee1d6c" path="/var/lib/kubelet/pods/c046fff7-da02-4064-9979-ea6734ee1d6c/volumes"
Apr 24 23:57:19.703511 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:19.703477 2569 generic.go:358] "Generic (PLEG): container finished" podID="76ce9c06-e24a-457a-93b2-f4675149ec48" containerID="89718ae4a049f9caf18b49b17d69663c8cd4d8be04f67ddffe45731df358264e" exitCode=0
Apr 24 23:57:19.703893 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:19.703532 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c5ddjs" event={"ID":"76ce9c06-e24a-457a-93b2-f4675149ec48","Type":"ContainerDied","Data":"89718ae4a049f9caf18b49b17d69663c8cd4d8be04f67ddffe45731df358264e"}
Apr 24 23:57:21.709604 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:21.709512 2569 generic.go:358] "Generic (PLEG): container finished" podID="76ce9c06-e24a-457a-93b2-f4675149ec48" containerID="bc0dc17706ec6198aa801c9d7a76324963945e2da8703f6cfcb71216150dd36f" exitCode=0
Apr 24 23:57:21.709604 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:21.709570 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c5ddjs" event={"ID":"76ce9c06-e24a-457a-93b2-f4675149ec48","Type":"ContainerDied","Data":"bc0dc17706ec6198aa801c9d7a76324963945e2da8703f6cfcb71216150dd36f"}
Apr 24 23:57:28.730935 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:28.730849 2569 generic.go:358] "Generic (PLEG): container finished" podID="76ce9c06-e24a-457a-93b2-f4675149ec48" containerID="0d8c56be2987b11d24c7af84819de4598391d443ed8a0ba8c0a91f27bb34b6cc" exitCode=0
Apr 24 23:57:28.730935 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:28.730926 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c5ddjs" event={"ID":"76ce9c06-e24a-457a-93b2-f4675149ec48","Type":"ContainerDied","Data":"0d8c56be2987b11d24c7af84819de4598391d443ed8a0ba8c0a91f27bb34b6cc"}
Apr 24 23:57:29.845255 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:29.845233 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c5ddjs"
Apr 24 23:57:29.993211 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:29.993114 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/76ce9c06-e24a-457a-93b2-f4675149ec48-util\") pod \"76ce9c06-e24a-457a-93b2-f4675149ec48\" (UID: \"76ce9c06-e24a-457a-93b2-f4675149ec48\") "
Apr 24 23:57:29.993211 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:29.993161 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w92np\" (UniqueName: \"kubernetes.io/projected/76ce9c06-e24a-457a-93b2-f4675149ec48-kube-api-access-w92np\") pod \"76ce9c06-e24a-457a-93b2-f4675149ec48\" (UID: \"76ce9c06-e24a-457a-93b2-f4675149ec48\") "
Apr 24 23:57:29.993211 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:29.993194 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/76ce9c06-e24a-457a-93b2-f4675149ec48-bundle\") pod \"76ce9c06-e24a-457a-93b2-f4675149ec48\" (UID: \"76ce9c06-e24a-457a-93b2-f4675149ec48\") "
Apr 24 23:57:29.993766 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:29.993742 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76ce9c06-e24a-457a-93b2-f4675149ec48-bundle" (OuterVolumeSpecName: "bundle") pod "76ce9c06-e24a-457a-93b2-f4675149ec48" (UID: "76ce9c06-e24a-457a-93b2-f4675149ec48"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 23:57:29.995362 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:29.995333 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76ce9c06-e24a-457a-93b2-f4675149ec48-kube-api-access-w92np" (OuterVolumeSpecName: "kube-api-access-w92np") pod "76ce9c06-e24a-457a-93b2-f4675149ec48" (UID: "76ce9c06-e24a-457a-93b2-f4675149ec48"). InnerVolumeSpecName "kube-api-access-w92np". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 23:57:29.997343 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:29.997322 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76ce9c06-e24a-457a-93b2-f4675149ec48-util" (OuterVolumeSpecName: "util") pod "76ce9c06-e24a-457a-93b2-f4675149ec48" (UID: "76ce9c06-e24a-457a-93b2-f4675149ec48"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 23:57:30.094172 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:30.094134 2569 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/76ce9c06-e24a-457a-93b2-f4675149ec48-util\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\""
Apr 24 23:57:30.094172 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:30.094163 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-w92np\" (UniqueName: \"kubernetes.io/projected/76ce9c06-e24a-457a-93b2-f4675149ec48-kube-api-access-w92np\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\""
Apr 24 23:57:30.094172 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:30.094172 2569 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/76ce9c06-e24a-457a-93b2-f4675149ec48-bundle\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\""
Apr 24 23:57:30.737479 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:30.737436 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c5ddjs" event={"ID":"76ce9c06-e24a-457a-93b2-f4675149ec48","Type":"ContainerDied","Data":"0ad3ce6f5f57022dd599ccefaa515318c1c658711195956e6c2c7cd0496fcf79"}
Apr 24 23:57:30.737479 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:30.737482 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ad3ce6f5f57022dd599ccefaa515318c1c658711195956e6c2c7cd0496fcf79"
Apr 24 23:57:30.737680 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:30.737499 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c5ddjs"
Apr 24 23:57:35.850015 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:35.849981 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-chrxt"]
Apr 24 23:57:35.850436 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:35.850237 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="76ce9c06-e24a-457a-93b2-f4675149ec48" containerName="pull"
Apr 24 23:57:35.850436 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:35.850248 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="76ce9c06-e24a-457a-93b2-f4675149ec48" containerName="pull"
Apr 24 23:57:35.850436 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:35.850264 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="76ce9c06-e24a-457a-93b2-f4675149ec48" containerName="util"
Apr 24 23:57:35.850436 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:35.850269 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="76ce9c06-e24a-457a-93b2-f4675149ec48" containerName="util"
Apr 24 23:57:35.850436 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:35.850275 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="76ce9c06-e24a-457a-93b2-f4675149ec48" containerName="extract"
Apr 24 23:57:35.850436 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:35.850281 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="76ce9c06-e24a-457a-93b2-f4675149ec48" containerName="extract"
Apr 24 23:57:35.850436 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:35.850290 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c046fff7-da02-4064-9979-ea6734ee1d6c" containerName="console"
Apr 24 23:57:35.850436 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:35.850295 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="c046fff7-da02-4064-9979-ea6734ee1d6c" containerName="console"
Apr 24 23:57:35.850436 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:35.850332 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="76ce9c06-e24a-457a-93b2-f4675149ec48" containerName="extract"
Apr 24 23:57:35.850436 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:35.850341 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="c046fff7-da02-4064-9979-ea6734ee1d6c" containerName="console"
Apr 24 23:57:35.855442 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:35.855423 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-chrxt"
Apr 24 23:57:35.857999 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:35.857965 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\""
Apr 24 23:57:35.858124 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:35.858079 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\""
Apr 24 23:57:35.858124 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:35.858092 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\""
Apr 24 23:57:35.858211 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:35.858168 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-zl994\""
Apr 24 23:57:35.863513 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:35.863491 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-chrxt"]
Apr 24 23:57:36.030686 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:36.030645 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/ca143fa9-2d5a-47d7-a85e-2ebcb0350671-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-chrxt\" (UID: \"ca143fa9-2d5a-47d7-a85e-2ebcb0350671\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-chrxt"
Apr 24 23:57:36.030686 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:36.030694 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59qmg\" (UniqueName: \"kubernetes.io/projected/ca143fa9-2d5a-47d7-a85e-2ebcb0350671-kube-api-access-59qmg\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-chrxt\" (UID: \"ca143fa9-2d5a-47d7-a85e-2ebcb0350671\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-chrxt"
Apr 24 23:57:36.132078 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:36.131999 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/ca143fa9-2d5a-47d7-a85e-2ebcb0350671-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-chrxt\" (UID: \"ca143fa9-2d5a-47d7-a85e-2ebcb0350671\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-chrxt"
Apr 24 23:57:36.132078 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:36.132046 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-59qmg\" (UniqueName: \"kubernetes.io/projected/ca143fa9-2d5a-47d7-a85e-2ebcb0350671-kube-api-access-59qmg\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-chrxt\" (UID: \"ca143fa9-2d5a-47d7-a85e-2ebcb0350671\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-chrxt"
Apr 24 23:57:36.134277 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:36.134257 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/ca143fa9-2d5a-47d7-a85e-2ebcb0350671-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-chrxt\" (UID: \"ca143fa9-2d5a-47d7-a85e-2ebcb0350671\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-chrxt"
Apr 24 23:57:36.140218 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:36.140195 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-59qmg\" (UniqueName: \"kubernetes.io/projected/ca143fa9-2d5a-47d7-a85e-2ebcb0350671-kube-api-access-59qmg\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-chrxt\" (UID: \"ca143fa9-2d5a-47d7-a85e-2ebcb0350671\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-chrxt"
Apr 24 23:57:36.165800 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:36.165774 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-chrxt"
Apr 24 23:57:36.284877 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:36.284845 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-chrxt"]
Apr 24 23:57:36.287764 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:57:36.287738 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca143fa9_2d5a_47d7_a85e_2ebcb0350671.slice/crio-392b5bf353af7be93d1275d947da490afd893e88ebe569b5b6cd5637507f0ad2 WatchSource:0}: Error finding container 392b5bf353af7be93d1275d947da490afd893e88ebe569b5b6cd5637507f0ad2: Status 404 returned error can't find the container with id 392b5bf353af7be93d1275d947da490afd893e88ebe569b5b6cd5637507f0ad2
Apr 24 23:57:36.760843 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:36.760807 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-chrxt" event={"ID":"ca143fa9-2d5a-47d7-a85e-2ebcb0350671","Type":"ContainerStarted","Data":"392b5bf353af7be93d1275d947da490afd893e88ebe569b5b6cd5637507f0ad2"}
Apr 24 23:57:40.689963 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:40.689932 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-5kdns"]
Apr 24 23:57:40.693190 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:40.693170 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-5kdns"
Apr 24 23:57:40.701230 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:40.695896 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\""
Apr 24 23:57:40.701230 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:40.695997 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\""
Apr 24 23:57:40.701230 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:40.696126 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-5rxdj\""
Apr 24 23:57:40.702587 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:40.702416 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-5kdns"]
Apr 24 23:57:40.775803 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:40.775762 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-chrxt" event={"ID":"ca143fa9-2d5a-47d7-a85e-2ebcb0350671","Type":"ContainerStarted","Data":"701f97f36f68bfff892fa791fc1a76da44cf31957fbeaf12850739cfd34bc3a6"}
Apr 24 23:57:40.775971 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:40.775954 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-chrxt"
Apr 24 23:57:40.805120 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:40.805054 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-chrxt" podStartSLOduration=1.92916625 podStartE2EDuration="5.805036941s" podCreationTimestamp="2026-04-24 23:57:35 +0000 UTC" firstStartedPulling="2026-04-24 23:57:36.289383814 +0000 UTC m=+234.894110390" lastFinishedPulling="2026-04-24 23:57:40.165254502 +0000 UTC m=+238.769981081" observedRunningTime="2026-04-24 23:57:40.803894525 +0000 UTC m=+239.408621122" watchObservedRunningTime="2026-04-24 23:57:40.805036941 +0000 UTC m=+239.409763542"
Apr 24 23:57:40.869179 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:40.869150 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mt2dp\" (UniqueName: \"kubernetes.io/projected/ad3472c8-7725-4780-8472-54081a48c048-kube-api-access-mt2dp\") pod \"keda-operator-ffbb595cb-5kdns\" (UID: \"ad3472c8-7725-4780-8472-54081a48c048\") " pod="openshift-keda/keda-operator-ffbb595cb-5kdns"
Apr 24 23:57:40.869349 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:40.869224 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/ad3472c8-7725-4780-8472-54081a48c048-cabundle0\") pod \"keda-operator-ffbb595cb-5kdns\" (UID: \"ad3472c8-7725-4780-8472-54081a48c048\") " pod="openshift-keda/keda-operator-ffbb595cb-5kdns"
Apr 24 23:57:40.869349 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:40.869277 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/ad3472c8-7725-4780-8472-54081a48c048-certificates\") pod \"keda-operator-ffbb595cb-5kdns\" (UID: \"ad3472c8-7725-4780-8472-54081a48c048\") " pod="openshift-keda/keda-operator-ffbb595cb-5kdns"
Apr 24 23:57:40.969830 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:40.969746 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/ad3472c8-7725-4780-8472-54081a48c048-cabundle0\") pod \"keda-operator-ffbb595cb-5kdns\" (UID: \"ad3472c8-7725-4780-8472-54081a48c048\") " pod="openshift-keda/keda-operator-ffbb595cb-5kdns"
Apr 24 23:57:40.969830 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:40.969776 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/ad3472c8-7725-4780-8472-54081a48c048-certificates\") pod \"keda-operator-ffbb595cb-5kdns\" (UID: \"ad3472c8-7725-4780-8472-54081a48c048\") " pod="openshift-keda/keda-operator-ffbb595cb-5kdns"
Apr 24 23:57:40.969830 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:40.969806 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mt2dp\" (UniqueName: \"kubernetes.io/projected/ad3472c8-7725-4780-8472-54081a48c048-kube-api-access-mt2dp\") pod \"keda-operator-ffbb595cb-5kdns\" (UID: \"ad3472c8-7725-4780-8472-54081a48c048\") " pod="openshift-keda/keda-operator-ffbb595cb-5kdns"
Apr 24 23:57:40.970117 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:57:40.969891 2569 secret.go:281] references non-existent secret key: ca.crt
Apr 24 23:57:40.970117 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:57:40.969907 2569 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 24 23:57:40.970117 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:57:40.969915 2569 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-5kdns: references non-existent secret key: ca.crt
Apr 24 23:57:40.970117 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:57:40.969967 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ad3472c8-7725-4780-8472-54081a48c048-certificates podName:ad3472c8-7725-4780-8472-54081a48c048 nodeName:}" failed. No retries permitted until 2026-04-24 23:57:41.469950925 +0000 UTC m=+240.074677501 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/ad3472c8-7725-4780-8472-54081a48c048-certificates") pod "keda-operator-ffbb595cb-5kdns" (UID: "ad3472c8-7725-4780-8472-54081a48c048") : references non-existent secret key: ca.crt
Apr 24 23:57:40.970526 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:40.970504 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/ad3472c8-7725-4780-8472-54081a48c048-cabundle0\") pod \"keda-operator-ffbb595cb-5kdns\" (UID: \"ad3472c8-7725-4780-8472-54081a48c048\") " pod="openshift-keda/keda-operator-ffbb595cb-5kdns"
Apr 24 23:57:40.979254 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:40.979235 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mt2dp\" (UniqueName: \"kubernetes.io/projected/ad3472c8-7725-4780-8472-54081a48c048-kube-api-access-mt2dp\") pod \"keda-operator-ffbb595cb-5kdns\" (UID: \"ad3472c8-7725-4780-8472-54081a48c048\") " pod="openshift-keda/keda-operator-ffbb595cb-5kdns"
Apr 24 23:57:41.052306 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:41.052275 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-p6vdq"]
Apr 24 23:57:41.055306 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:41.055292 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-p6vdq"
Apr 24 23:57:41.057569 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:41.057548 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\""
Apr 24 23:57:41.063303 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:41.063283 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-p6vdq"]
Apr 24 23:57:41.172082 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:41.172053 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-859th\" (UniqueName: \"kubernetes.io/projected/6998e575-77b2-4ae3-b1dd-ce86ee14e79a-kube-api-access-859th\") pod \"keda-metrics-apiserver-7c9f485588-p6vdq\" (UID: \"6998e575-77b2-4ae3-b1dd-ce86ee14e79a\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-p6vdq"
Apr 24 23:57:41.172238 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:41.172131 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/6998e575-77b2-4ae3-b1dd-ce86ee14e79a-certificates\") pod \"keda-metrics-apiserver-7c9f485588-p6vdq\" (UID: \"6998e575-77b2-4ae3-b1dd-ce86ee14e79a\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-p6vdq"
Apr 24 23:57:41.172238 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:41.172160 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/6998e575-77b2-4ae3-b1dd-ce86ee14e79a-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-p6vdq\" (UID: \"6998e575-77b2-4ae3-b1dd-ce86ee14e79a\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-p6vdq"
Apr 24 23:57:41.273479 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:41.273445 2569 reconciler_common.go:224]
"operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/6998e575-77b2-4ae3-b1dd-ce86ee14e79a-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-p6vdq\" (UID: \"6998e575-77b2-4ae3-b1dd-ce86ee14e79a\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-p6vdq" Apr 24 23:57:41.273638 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:41.273511 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-859th\" (UniqueName: \"kubernetes.io/projected/6998e575-77b2-4ae3-b1dd-ce86ee14e79a-kube-api-access-859th\") pod \"keda-metrics-apiserver-7c9f485588-p6vdq\" (UID: \"6998e575-77b2-4ae3-b1dd-ce86ee14e79a\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-p6vdq" Apr 24 23:57:41.273638 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:41.273613 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/6998e575-77b2-4ae3-b1dd-ce86ee14e79a-certificates\") pod \"keda-metrics-apiserver-7c9f485588-p6vdq\" (UID: \"6998e575-77b2-4ae3-b1dd-ce86ee14e79a\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-p6vdq" Apr 24 23:57:41.273766 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:57:41.273725 2569 secret.go:281] references non-existent secret key: tls.crt Apr 24 23:57:41.273766 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:57:41.273742 2569 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 24 23:57:41.273766 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:57:41.273763 2569 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-p6vdq: references non-existent secret key: tls.crt Apr 24 23:57:41.273906 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:41.273805 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: 
\"kubernetes.io/empty-dir/6998e575-77b2-4ae3-b1dd-ce86ee14e79a-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-p6vdq\" (UID: \"6998e575-77b2-4ae3-b1dd-ce86ee14e79a\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-p6vdq" Apr 24 23:57:41.273906 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:57:41.273868 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6998e575-77b2-4ae3-b1dd-ce86ee14e79a-certificates podName:6998e575-77b2-4ae3-b1dd-ce86ee14e79a nodeName:}" failed. No retries permitted until 2026-04-24 23:57:41.773846939 +0000 UTC m=+240.378573528 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/6998e575-77b2-4ae3-b1dd-ce86ee14e79a-certificates") pod "keda-metrics-apiserver-7c9f485588-p6vdq" (UID: "6998e575-77b2-4ae3-b1dd-ce86ee14e79a") : references non-existent secret key: tls.crt Apr 24 23:57:41.282667 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:41.282642 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-859th\" (UniqueName: \"kubernetes.io/projected/6998e575-77b2-4ae3-b1dd-ce86ee14e79a-kube-api-access-859th\") pod \"keda-metrics-apiserver-7c9f485588-p6vdq\" (UID: \"6998e575-77b2-4ae3-b1dd-ce86ee14e79a\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-p6vdq" Apr 24 23:57:41.476334 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:41.476300 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/ad3472c8-7725-4780-8472-54081a48c048-certificates\") pod \"keda-operator-ffbb595cb-5kdns\" (UID: \"ad3472c8-7725-4780-8472-54081a48c048\") " pod="openshift-keda/keda-operator-ffbb595cb-5kdns" Apr 24 23:57:41.476512 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:57:41.476469 2569 secret.go:281] references non-existent secret key: ca.crt Apr 24 23:57:41.476512 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:57:41.476487 
2569 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 24 23:57:41.476512 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:57:41.476498 2569 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-5kdns: references non-existent secret key: ca.crt Apr 24 23:57:41.476611 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:57:41.476588 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ad3472c8-7725-4780-8472-54081a48c048-certificates podName:ad3472c8-7725-4780-8472-54081a48c048 nodeName:}" failed. No retries permitted until 2026-04-24 23:57:42.476568656 +0000 UTC m=+241.081295254 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/ad3472c8-7725-4780-8472-54081a48c048-certificates") pod "keda-operator-ffbb595cb-5kdns" (UID: "ad3472c8-7725-4780-8472-54081a48c048") : references non-existent secret key: ca.crt Apr 24 23:57:41.778591 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:41.778560 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/6998e575-77b2-4ae3-b1dd-ce86ee14e79a-certificates\") pod \"keda-metrics-apiserver-7c9f485588-p6vdq\" (UID: \"6998e575-77b2-4ae3-b1dd-ce86ee14e79a\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-p6vdq" Apr 24 23:57:41.779045 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:57:41.778706 2569 secret.go:281] references non-existent secret key: tls.crt Apr 24 23:57:41.779045 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:57:41.778721 2569 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 24 23:57:41.779045 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:57:41.778740 2569 projected.go:194] Error preparing data for projected volume certificates for 
pod openshift-keda/keda-metrics-apiserver-7c9f485588-p6vdq: references non-existent secret key: tls.crt Apr 24 23:57:41.779045 ip-10-0-133-214 kubenswrapper[2569]: E0424 23:57:41.778808 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6998e575-77b2-4ae3-b1dd-ce86ee14e79a-certificates podName:6998e575-77b2-4ae3-b1dd-ce86ee14e79a nodeName:}" failed. No retries permitted until 2026-04-24 23:57:42.778789105 +0000 UTC m=+241.383515684 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/6998e575-77b2-4ae3-b1dd-ce86ee14e79a-certificates") pod "keda-metrics-apiserver-7c9f485588-p6vdq" (UID: "6998e575-77b2-4ae3-b1dd-ce86ee14e79a") : references non-existent secret key: tls.crt Apr 24 23:57:42.484592 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:42.484509 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/ad3472c8-7725-4780-8472-54081a48c048-certificates\") pod \"keda-operator-ffbb595cb-5kdns\" (UID: \"ad3472c8-7725-4780-8472-54081a48c048\") " pod="openshift-keda/keda-operator-ffbb595cb-5kdns" Apr 24 23:57:42.486910 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:42.486886 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/ad3472c8-7725-4780-8472-54081a48c048-certificates\") pod \"keda-operator-ffbb595cb-5kdns\" (UID: \"ad3472c8-7725-4780-8472-54081a48c048\") " pod="openshift-keda/keda-operator-ffbb595cb-5kdns" Apr 24 23:57:42.511028 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:42.511001 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-5rxdj\"" Apr 24 23:57:42.518989 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:42.518967 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-5kdns" Apr 24 23:57:42.634202 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:42.634172 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-5kdns"] Apr 24 23:57:42.638475 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:57:42.638446 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad3472c8_7725_4780_8472_54081a48c048.slice/crio-4bc8a7033fc44551aeb080905182dbc4a0dfc23ff153097e0e8e159c76c45778 WatchSource:0}: Error finding container 4bc8a7033fc44551aeb080905182dbc4a0dfc23ff153097e0e8e159c76c45778: Status 404 returned error can't find the container with id 4bc8a7033fc44551aeb080905182dbc4a0dfc23ff153097e0e8e159c76c45778 Apr 24 23:57:42.782466 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:42.782433 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-5kdns" event={"ID":"ad3472c8-7725-4780-8472-54081a48c048","Type":"ContainerStarted","Data":"4bc8a7033fc44551aeb080905182dbc4a0dfc23ff153097e0e8e159c76c45778"} Apr 24 23:57:42.786794 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:42.786774 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/6998e575-77b2-4ae3-b1dd-ce86ee14e79a-certificates\") pod \"keda-metrics-apiserver-7c9f485588-p6vdq\" (UID: \"6998e575-77b2-4ae3-b1dd-ce86ee14e79a\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-p6vdq" Apr 24 23:57:42.789320 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:42.789298 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/6998e575-77b2-4ae3-b1dd-ce86ee14e79a-certificates\") pod \"keda-metrics-apiserver-7c9f485588-p6vdq\" (UID: \"6998e575-77b2-4ae3-b1dd-ce86ee14e79a\") " 
pod="openshift-keda/keda-metrics-apiserver-7c9f485588-p6vdq" Apr 24 23:57:42.866161 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:42.866133 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-p6vdq" Apr 24 23:57:42.982061 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:42.982037 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-p6vdq"] Apr 24 23:57:42.984250 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:57:42.984222 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6998e575_77b2_4ae3_b1dd_ce86ee14e79a.slice/crio-b8ce1474f43f459208bf77cec93616d8f199b1cc3336bbdad364361936f1d077 WatchSource:0}: Error finding container b8ce1474f43f459208bf77cec93616d8f199b1cc3336bbdad364361936f1d077: Status 404 returned error can't find the container with id b8ce1474f43f459208bf77cec93616d8f199b1cc3336bbdad364361936f1d077 Apr 24 23:57:43.786913 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:43.786870 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-p6vdq" event={"ID":"6998e575-77b2-4ae3-b1dd-ce86ee14e79a","Type":"ContainerStarted","Data":"b8ce1474f43f459208bf77cec93616d8f199b1cc3336bbdad364361936f1d077"} Apr 24 23:57:47.801890 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:47.801850 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-p6vdq" event={"ID":"6998e575-77b2-4ae3-b1dd-ce86ee14e79a","Type":"ContainerStarted","Data":"9cb431e000acc58bb884950d317eceaf60030eb0d616669e4a8b24cb435b89f4"} Apr 24 23:57:47.802356 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:47.802112 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-p6vdq" Apr 24 23:57:47.819282 ip-10-0-133-214 
kubenswrapper[2569]: I0424 23:57:47.819220 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-p6vdq" podStartSLOduration=2.749750424 podStartE2EDuration="6.819204084s" podCreationTimestamp="2026-04-24 23:57:41 +0000 UTC" firstStartedPulling="2026-04-24 23:57:42.985587382 +0000 UTC m=+241.590313959" lastFinishedPulling="2026-04-24 23:57:47.055041029 +0000 UTC m=+245.659767619" observedRunningTime="2026-04-24 23:57:47.817717265 +0000 UTC m=+246.422443862" watchObservedRunningTime="2026-04-24 23:57:47.819204084 +0000 UTC m=+246.423930686" Apr 24 23:57:48.806560 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:48.806515 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-5kdns" event={"ID":"ad3472c8-7725-4780-8472-54081a48c048","Type":"ContainerStarted","Data":"fa0ba9696cd3be669b520d98e0fd36820facd6f40080f36890ad6d314e253506"} Apr 24 23:57:48.823806 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:48.823757 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-5kdns" podStartSLOduration=3.15418685 podStartE2EDuration="8.823742568s" podCreationTimestamp="2026-04-24 23:57:40 +0000 UTC" firstStartedPulling="2026-04-24 23:57:42.639983355 +0000 UTC m=+241.244709935" lastFinishedPulling="2026-04-24 23:57:48.309539073 +0000 UTC m=+246.914265653" observedRunningTime="2026-04-24 23:57:48.822773968 +0000 UTC m=+247.427500566" watchObservedRunningTime="2026-04-24 23:57:48.823742568 +0000 UTC m=+247.428469396" Apr 24 23:57:49.810124 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:49.810089 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-5kdns" Apr 24 23:57:58.810731 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:57:58.810703 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-keda/keda-metrics-apiserver-7c9f485588-p6vdq" Apr 24 23:58:01.781335 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:58:01.781259 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-chrxt" Apr 24 23:58:10.815007 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:58:10.814969 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-5kdns" Apr 24 23:58:33.769111 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:58:33.769075 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d5qnn7"] Apr 24 23:58:33.772433 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:58:33.772396 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d5qnn7" Apr 24 23:58:33.774708 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:58:33.774685 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 24 23:58:33.775441 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:58:33.775421 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-7pf4m\"" Apr 24 23:58:33.775550 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:58:33.775476 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 24 23:58:33.780079 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:58:33.780059 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d5qnn7"] Apr 24 23:58:33.856744 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:58:33.856713 2569 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdfxl\" (UniqueName: \"kubernetes.io/projected/13afc63c-c77b-44ae-bcce-30e512a170cb-kube-api-access-jdfxl\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d5qnn7\" (UID: \"13afc63c-c77b-44ae-bcce-30e512a170cb\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d5qnn7" Apr 24 23:58:33.856890 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:58:33.856749 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/13afc63c-c77b-44ae-bcce-30e512a170cb-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d5qnn7\" (UID: \"13afc63c-c77b-44ae-bcce-30e512a170cb\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d5qnn7" Apr 24 23:58:33.856890 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:58:33.856777 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/13afc63c-c77b-44ae-bcce-30e512a170cb-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d5qnn7\" (UID: \"13afc63c-c77b-44ae-bcce-30e512a170cb\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d5qnn7" Apr 24 23:58:33.957206 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:58:33.957177 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jdfxl\" (UniqueName: \"kubernetes.io/projected/13afc63c-c77b-44ae-bcce-30e512a170cb-kube-api-access-jdfxl\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d5qnn7\" (UID: \"13afc63c-c77b-44ae-bcce-30e512a170cb\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d5qnn7" Apr 24 23:58:33.957364 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:58:33.957213 2569 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/13afc63c-c77b-44ae-bcce-30e512a170cb-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d5qnn7\" (UID: \"13afc63c-c77b-44ae-bcce-30e512a170cb\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d5qnn7" Apr 24 23:58:33.957364 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:58:33.957239 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/13afc63c-c77b-44ae-bcce-30e512a170cb-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d5qnn7\" (UID: \"13afc63c-c77b-44ae-bcce-30e512a170cb\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d5qnn7" Apr 24 23:58:33.957621 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:58:33.957606 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/13afc63c-c77b-44ae-bcce-30e512a170cb-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d5qnn7\" (UID: \"13afc63c-c77b-44ae-bcce-30e512a170cb\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d5qnn7" Apr 24 23:58:33.957686 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:58:33.957669 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/13afc63c-c77b-44ae-bcce-30e512a170cb-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d5qnn7\" (UID: \"13afc63c-c77b-44ae-bcce-30e512a170cb\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d5qnn7" Apr 24 23:58:33.964864 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:58:33.964832 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdfxl\" (UniqueName: 
\"kubernetes.io/projected/13afc63c-c77b-44ae-bcce-30e512a170cb-kube-api-access-jdfxl\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d5qnn7\" (UID: \"13afc63c-c77b-44ae-bcce-30e512a170cb\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d5qnn7" Apr 24 23:58:34.082245 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:58:34.082152 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d5qnn7" Apr 24 23:58:34.199430 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:58:34.199376 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d5qnn7"] Apr 24 23:58:34.202024 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:58:34.201997 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod13afc63c_c77b_44ae_bcce_30e512a170cb.slice/crio-f5f5672b35417f5c8bda003dedea225cc15005350cba6ef47543778cc4618891 WatchSource:0}: Error finding container f5f5672b35417f5c8bda003dedea225cc15005350cba6ef47543778cc4618891: Status 404 returned error can't find the container with id f5f5672b35417f5c8bda003dedea225cc15005350cba6ef47543778cc4618891 Apr 24 23:58:34.946365 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:58:34.946329 2569 generic.go:358] "Generic (PLEG): container finished" podID="13afc63c-c77b-44ae-bcce-30e512a170cb" containerID="2facc4cb68f97555f9fad940f205f8388a861e1308aa2e710834fd83d718f613" exitCode=0 Apr 24 23:58:34.946762 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:58:34.946421 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d5qnn7" event={"ID":"13afc63c-c77b-44ae-bcce-30e512a170cb","Type":"ContainerDied","Data":"2facc4cb68f97555f9fad940f205f8388a861e1308aa2e710834fd83d718f613"} Apr 24 
23:58:34.946762 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:58:34.946453 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d5qnn7" event={"ID":"13afc63c-c77b-44ae-bcce-30e512a170cb","Type":"ContainerStarted","Data":"f5f5672b35417f5c8bda003dedea225cc15005350cba6ef47543778cc4618891"} Apr 24 23:58:38.959878 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:58:38.959795 2569 generic.go:358] "Generic (PLEG): container finished" podID="13afc63c-c77b-44ae-bcce-30e512a170cb" containerID="025c402a1f0f5f1603995e2486c886d4b13520c5b44c0cf92f305ffdc91052a6" exitCode=0 Apr 24 23:58:38.959878 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:58:38.959841 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d5qnn7" event={"ID":"13afc63c-c77b-44ae-bcce-30e512a170cb","Type":"ContainerDied","Data":"025c402a1f0f5f1603995e2486c886d4b13520c5b44c0cf92f305ffdc91052a6"} Apr 24 23:58:39.965043 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:58:39.965011 2569 generic.go:358] "Generic (PLEG): container finished" podID="13afc63c-c77b-44ae-bcce-30e512a170cb" containerID="77ad4dfa4c97174e749d3fe15ebc578766835c50b56fecba199e368b5f3d599a" exitCode=0 Apr 24 23:58:39.965452 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:58:39.965099 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d5qnn7" event={"ID":"13afc63c-c77b-44ae-bcce-30e512a170cb","Type":"ContainerDied","Data":"77ad4dfa4c97174e749d3fe15ebc578766835c50b56fecba199e368b5f3d599a"} Apr 24 23:58:41.089099 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:58:41.089073 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d5qnn7" Apr 24 23:58:41.211949 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:58:41.211916 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/13afc63c-c77b-44ae-bcce-30e512a170cb-bundle\") pod \"13afc63c-c77b-44ae-bcce-30e512a170cb\" (UID: \"13afc63c-c77b-44ae-bcce-30e512a170cb\") " Apr 24 23:58:41.212129 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:58:41.211962 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/13afc63c-c77b-44ae-bcce-30e512a170cb-util\") pod \"13afc63c-c77b-44ae-bcce-30e512a170cb\" (UID: \"13afc63c-c77b-44ae-bcce-30e512a170cb\") " Apr 24 23:58:41.212129 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:58:41.212032 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jdfxl\" (UniqueName: \"kubernetes.io/projected/13afc63c-c77b-44ae-bcce-30e512a170cb-kube-api-access-jdfxl\") pod \"13afc63c-c77b-44ae-bcce-30e512a170cb\" (UID: \"13afc63c-c77b-44ae-bcce-30e512a170cb\") " Apr 24 23:58:41.212681 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:58:41.212644 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13afc63c-c77b-44ae-bcce-30e512a170cb-bundle" (OuterVolumeSpecName: "bundle") pod "13afc63c-c77b-44ae-bcce-30e512a170cb" (UID: "13afc63c-c77b-44ae-bcce-30e512a170cb"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 23:58:41.214103 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:58:41.214075 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13afc63c-c77b-44ae-bcce-30e512a170cb-kube-api-access-jdfxl" (OuterVolumeSpecName: "kube-api-access-jdfxl") pod "13afc63c-c77b-44ae-bcce-30e512a170cb" (UID: "13afc63c-c77b-44ae-bcce-30e512a170cb"). InnerVolumeSpecName "kube-api-access-jdfxl". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 23:58:41.216602 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:58:41.216572 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13afc63c-c77b-44ae-bcce-30e512a170cb-util" (OuterVolumeSpecName: "util") pod "13afc63c-c77b-44ae-bcce-30e512a170cb" (UID: "13afc63c-c77b-44ae-bcce-30e512a170cb"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 23:58:41.312538 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:58:41.312512 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jdfxl\" (UniqueName: \"kubernetes.io/projected/13afc63c-c77b-44ae-bcce-30e512a170cb-kube-api-access-jdfxl\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\""
Apr 24 23:58:41.312538 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:58:41.312534 2569 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/13afc63c-c77b-44ae-bcce-30e512a170cb-bundle\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\""
Apr 24 23:58:41.312538 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:58:41.312545 2569 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/13afc63c-c77b-44ae-bcce-30e512a170cb-util\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\""
Apr 24 23:58:41.862368 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:58:41.862344 2569 kubelet.go:1628] "Image garbage collection succeeded"
Apr 24 23:58:41.974009 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:58:41.973855 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d5qnn7"
Apr 24 23:58:41.995730 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:58:41.976085 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d5qnn7" event={"ID":"13afc63c-c77b-44ae-bcce-30e512a170cb","Type":"ContainerDied","Data":"f5f5672b35417f5c8bda003dedea225cc15005350cba6ef47543778cc4618891"}
Apr 24 23:58:41.995730 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:58:41.976111 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f5f5672b35417f5c8bda003dedea225cc15005350cba6ef47543778cc4618891"
Apr 24 23:58:46.576792 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:58:46.576759 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-x2562"]
Apr 24 23:58:46.577162 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:58:46.577022 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="13afc63c-c77b-44ae-bcce-30e512a170cb" containerName="util"
Apr 24 23:58:46.577162 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:58:46.577032 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="13afc63c-c77b-44ae-bcce-30e512a170cb" containerName="util"
Apr 24 23:58:46.577162 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:58:46.577039 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="13afc63c-c77b-44ae-bcce-30e512a170cb" containerName="extract"
Apr 24 23:58:46.577162 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:58:46.577045 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="13afc63c-c77b-44ae-bcce-30e512a170cb" containerName="extract"
Apr 24 23:58:46.577162 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:58:46.577053 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="13afc63c-c77b-44ae-bcce-30e512a170cb" containerName="pull"
Apr 24 23:58:46.577162 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:58:46.577059 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="13afc63c-c77b-44ae-bcce-30e512a170cb" containerName="pull"
Apr 24 23:58:46.577162 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:58:46.577111 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="13afc63c-c77b-44ae-bcce-30e512a170cb" containerName="extract"
Apr 24 23:58:46.579993 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:58:46.579977 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-x2562"
Apr 24 23:58:46.582378 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:58:46.582353 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\""
Apr 24 23:58:46.582378 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:58:46.582365 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\""
Apr 24 23:58:46.582649 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:58:46.582636 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-pfkdq\""
Apr 24 23:58:46.591740 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:58:46.591720 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-x2562"]
Apr 24 23:58:46.654589 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:58:46.654552 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndvwd\" (UniqueName: \"kubernetes.io/projected/908a4c07-5c4b-4f85-8d6d-db2accb34c28-kube-api-access-ndvwd\") pod \"cert-manager-operator-controller-manager-54b9655956-x2562\" (UID: \"908a4c07-5c4b-4f85-8d6d-db2accb34c28\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-x2562"
Apr 24 23:58:46.654759 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:58:46.654636 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/908a4c07-5c4b-4f85-8d6d-db2accb34c28-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-x2562\" (UID: \"908a4c07-5c4b-4f85-8d6d-db2accb34c28\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-x2562"
Apr 24 23:58:46.756001 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:58:46.755972 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/908a4c07-5c4b-4f85-8d6d-db2accb34c28-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-x2562\" (UID: \"908a4c07-5c4b-4f85-8d6d-db2accb34c28\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-x2562"
Apr 24 23:58:46.756134 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:58:46.756028 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ndvwd\" (UniqueName: \"kubernetes.io/projected/908a4c07-5c4b-4f85-8d6d-db2accb34c28-kube-api-access-ndvwd\") pod \"cert-manager-operator-controller-manager-54b9655956-x2562\" (UID: \"908a4c07-5c4b-4f85-8d6d-db2accb34c28\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-x2562"
Apr 24 23:58:46.756347 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:58:46.756329 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/908a4c07-5c4b-4f85-8d6d-db2accb34c28-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-x2562\" (UID: \"908a4c07-5c4b-4f85-8d6d-db2accb34c28\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-x2562"
Apr 24 23:58:46.764245 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:58:46.764218 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndvwd\" (UniqueName: \"kubernetes.io/projected/908a4c07-5c4b-4f85-8d6d-db2accb34c28-kube-api-access-ndvwd\") pod \"cert-manager-operator-controller-manager-54b9655956-x2562\" (UID: \"908a4c07-5c4b-4f85-8d6d-db2accb34c28\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-x2562"
Apr 24 23:58:46.888944 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:58:46.888872 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-x2562"
Apr 24 23:58:47.013251 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:58:47.013227 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-x2562"]
Apr 24 23:58:47.015604 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:58:47.015573 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod908a4c07_5c4b_4f85_8d6d_db2accb34c28.slice/crio-8c4d5e00d1c8b7dca0032aa64ddba6a71dbb102d0c34bbda12074e181412fac2 WatchSource:0}: Error finding container 8c4d5e00d1c8b7dca0032aa64ddba6a71dbb102d0c34bbda12074e181412fac2: Status 404 returned error can't find the container with id 8c4d5e00d1c8b7dca0032aa64ddba6a71dbb102d0c34bbda12074e181412fac2
Apr 24 23:58:47.018167 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:58:47.018148 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 23:58:47.990206 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:58:47.990167 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-x2562" event={"ID":"908a4c07-5c4b-4f85-8d6d-db2accb34c28","Type":"ContainerStarted","Data":"8c4d5e00d1c8b7dca0032aa64ddba6a71dbb102d0c34bbda12074e181412fac2"}
Apr 24 23:58:48.994796 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:58:48.994716 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-x2562" event={"ID":"908a4c07-5c4b-4f85-8d6d-db2accb34c28","Type":"ContainerStarted","Data":"b2862eace77164b86c754ace885a5937d4d27160cdc64674aed61b02cccf4576"}
Apr 24 23:58:49.014230 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:58:49.014181 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-x2562" podStartSLOduration=1.391560706 podStartE2EDuration="3.014166021s" podCreationTimestamp="2026-04-24 23:58:46 +0000 UTC" firstStartedPulling="2026-04-24 23:58:47.018350706 +0000 UTC m=+305.623077282" lastFinishedPulling="2026-04-24 23:58:48.640956011 +0000 UTC m=+307.245682597" observedRunningTime="2026-04-24 23:58:49.013592054 +0000 UTC m=+307.618318652" watchObservedRunningTime="2026-04-24 23:58:49.014166021 +0000 UTC m=+307.618892626"
Apr 24 23:58:50.302134 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:58:50.302100 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fm2jbp"]
Apr 24 23:58:50.305496 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:58:50.305481 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fm2jbp"
Apr 24 23:58:50.309735 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:58:50.309716 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-7pf4m\""
Apr 24 23:58:50.309994 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:58:50.309975 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 24 23:58:50.310815 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:58:50.310798 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 24 23:58:50.317081 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:58:50.317057 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fm2jbp"]
Apr 24 23:58:50.383727 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:58:50.383698 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndjpv\" (UniqueName: \"kubernetes.io/projected/906e649f-4175-441f-a7bd-90980c4e1ffa-kube-api-access-ndjpv\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fm2jbp\" (UID: \"906e649f-4175-441f-a7bd-90980c4e1ffa\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fm2jbp"
Apr 24 23:58:50.383853 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:58:50.383753 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/906e649f-4175-441f-a7bd-90980c4e1ffa-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fm2jbp\" (UID: \"906e649f-4175-441f-a7bd-90980c4e1ffa\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fm2jbp"
Apr 24 23:58:50.383853 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:58:50.383832 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/906e649f-4175-441f-a7bd-90980c4e1ffa-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fm2jbp\" (UID: \"906e649f-4175-441f-a7bd-90980c4e1ffa\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fm2jbp"
Apr 24 23:58:50.484607 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:58:50.484577 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/906e649f-4175-441f-a7bd-90980c4e1ffa-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fm2jbp\" (UID: \"906e649f-4175-441f-a7bd-90980c4e1ffa\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fm2jbp"
Apr 24 23:58:50.484781 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:58:50.484646 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/906e649f-4175-441f-a7bd-90980c4e1ffa-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fm2jbp\" (UID: \"906e649f-4175-441f-a7bd-90980c4e1ffa\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fm2jbp"
Apr 24 23:58:50.484781 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:58:50.484698 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ndjpv\" (UniqueName: \"kubernetes.io/projected/906e649f-4175-441f-a7bd-90980c4e1ffa-kube-api-access-ndjpv\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fm2jbp\" (UID: \"906e649f-4175-441f-a7bd-90980c4e1ffa\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fm2jbp"
Apr 24 23:58:50.485070 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:58:50.485042 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/906e649f-4175-441f-a7bd-90980c4e1ffa-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fm2jbp\" (UID: \"906e649f-4175-441f-a7bd-90980c4e1ffa\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fm2jbp"
Apr 24 23:58:50.485179 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:58:50.485072 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/906e649f-4175-441f-a7bd-90980c4e1ffa-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fm2jbp\" (UID: \"906e649f-4175-441f-a7bd-90980c4e1ffa\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fm2jbp"
Apr 24 23:58:50.492808 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:58:50.492786 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndjpv\" (UniqueName: \"kubernetes.io/projected/906e649f-4175-441f-a7bd-90980c4e1ffa-kube-api-access-ndjpv\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fm2jbp\" (UID: \"906e649f-4175-441f-a7bd-90980c4e1ffa\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fm2jbp"
Apr 24 23:58:50.614434 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:58:50.614355 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fm2jbp"
Apr 24 23:58:50.730982 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:58:50.730959 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fm2jbp"]
Apr 24 23:58:50.733272 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:58:50.733235 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod906e649f_4175_441f_a7bd_90980c4e1ffa.slice/crio-e694d3616ee4129c821788bb61be16d54a18b0cf62b5891928b8c2ddbe30879b WatchSource:0}: Error finding container e694d3616ee4129c821788bb61be16d54a18b0cf62b5891928b8c2ddbe30879b: Status 404 returned error can't find the container with id e694d3616ee4129c821788bb61be16d54a18b0cf62b5891928b8c2ddbe30879b
Apr 24 23:58:51.001615 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:58:51.001576 2569 generic.go:358] "Generic (PLEG): container finished" podID="906e649f-4175-441f-a7bd-90980c4e1ffa" containerID="796a1311f74a1af8d7669e568edddd0df606f2158d295aacaaffb6c5523b0fef" exitCode=0
Apr 24 23:58:51.001758 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:58:51.001650 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fm2jbp" event={"ID":"906e649f-4175-441f-a7bd-90980c4e1ffa","Type":"ContainerDied","Data":"796a1311f74a1af8d7669e568edddd0df606f2158d295aacaaffb6c5523b0fef"}
Apr 24 23:58:51.001758 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:58:51.001681 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fm2jbp" event={"ID":"906e649f-4175-441f-a7bd-90980c4e1ffa","Type":"ContainerStarted","Data":"e694d3616ee4129c821788bb61be16d54a18b0cf62b5891928b8c2ddbe30879b"}
Apr 24 23:58:54.012355 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:58:54.012318 2569 generic.go:358] "Generic (PLEG): container finished" podID="906e649f-4175-441f-a7bd-90980c4e1ffa" containerID="4324716abc58c64f4cdc1e70910b66671f54cbdb2f6271f37a58637d60f2188a" exitCode=0
Apr 24 23:58:54.012747 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:58:54.012369 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fm2jbp" event={"ID":"906e649f-4175-441f-a7bd-90980c4e1ffa","Type":"ContainerDied","Data":"4324716abc58c64f4cdc1e70910b66671f54cbdb2f6271f37a58637d60f2188a"}
Apr 24 23:58:55.022447 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:58:55.022398 2569 generic.go:358] "Generic (PLEG): container finished" podID="906e649f-4175-441f-a7bd-90980c4e1ffa" containerID="04706e1ae00f4c491bd461ea1956df08e18984657e97eafb677bca2aeae9eb21" exitCode=0
Apr 24 23:58:55.022814 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:58:55.022470 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fm2jbp" event={"ID":"906e649f-4175-441f-a7bd-90980c4e1ffa","Type":"ContainerDied","Data":"04706e1ae00f4c491bd461ea1956df08e18984657e97eafb677bca2aeae9eb21"}
Apr 24 23:58:56.141358 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:58:56.141337 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fm2jbp"
Apr 24 23:58:56.231816 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:58:56.231787 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/906e649f-4175-441f-a7bd-90980c4e1ffa-bundle\") pod \"906e649f-4175-441f-a7bd-90980c4e1ffa\" (UID: \"906e649f-4175-441f-a7bd-90980c4e1ffa\") "
Apr 24 23:58:56.232001 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:58:56.231829 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/906e649f-4175-441f-a7bd-90980c4e1ffa-util\") pod \"906e649f-4175-441f-a7bd-90980c4e1ffa\" (UID: \"906e649f-4175-441f-a7bd-90980c4e1ffa\") "
Apr 24 23:58:56.232001 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:58:56.231851 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ndjpv\" (UniqueName: \"kubernetes.io/projected/906e649f-4175-441f-a7bd-90980c4e1ffa-kube-api-access-ndjpv\") pod \"906e649f-4175-441f-a7bd-90980c4e1ffa\" (UID: \"906e649f-4175-441f-a7bd-90980c4e1ffa\") "
Apr 24 23:58:56.232233 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:58:56.232208 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/906e649f-4175-441f-a7bd-90980c4e1ffa-bundle" (OuterVolumeSpecName: "bundle") pod "906e649f-4175-441f-a7bd-90980c4e1ffa" (UID: "906e649f-4175-441f-a7bd-90980c4e1ffa"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 23:58:56.233880 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:58:56.233856 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/906e649f-4175-441f-a7bd-90980c4e1ffa-kube-api-access-ndjpv" (OuterVolumeSpecName: "kube-api-access-ndjpv") pod "906e649f-4175-441f-a7bd-90980c4e1ffa" (UID: "906e649f-4175-441f-a7bd-90980c4e1ffa"). InnerVolumeSpecName "kube-api-access-ndjpv". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 23:58:56.236164 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:58:56.236145 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/906e649f-4175-441f-a7bd-90980c4e1ffa-util" (OuterVolumeSpecName: "util") pod "906e649f-4175-441f-a7bd-90980c4e1ffa" (UID: "906e649f-4175-441f-a7bd-90980c4e1ffa"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 23:58:56.332535 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:58:56.332468 2569 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/906e649f-4175-441f-a7bd-90980c4e1ffa-bundle\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\""
Apr 24 23:58:56.332535 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:58:56.332493 2569 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/906e649f-4175-441f-a7bd-90980c4e1ffa-util\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\""
Apr 24 23:58:56.332535 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:58:56.332502 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ndjpv\" (UniqueName: \"kubernetes.io/projected/906e649f-4175-441f-a7bd-90980c4e1ffa-kube-api-access-ndjpv\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\""
Apr 24 23:58:57.029493 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:58:57.029454 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fm2jbp" event={"ID":"906e649f-4175-441f-a7bd-90980c4e1ffa","Type":"ContainerDied","Data":"e694d3616ee4129c821788bb61be16d54a18b0cf62b5891928b8c2ddbe30879b"}
Apr 24 23:58:57.029493 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:58:57.029480 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fm2jbp"
Apr 24 23:58:57.029493 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:58:57.029492 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e694d3616ee4129c821788bb61be16d54a18b0cf62b5891928b8c2ddbe30879b"
Apr 24 23:59:11.680553 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:11.680516 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48356r9r9"]
Apr 24 23:59:11.681005 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:11.680822 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="906e649f-4175-441f-a7bd-90980c4e1ffa" containerName="util"
Apr 24 23:59:11.681005 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:11.680833 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="906e649f-4175-441f-a7bd-90980c4e1ffa" containerName="util"
Apr 24 23:59:11.681005 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:11.680841 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="906e649f-4175-441f-a7bd-90980c4e1ffa" containerName="pull"
Apr 24 23:59:11.681005 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:11.680846 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="906e649f-4175-441f-a7bd-90980c4e1ffa" containerName="pull"
Apr 24 23:59:11.681005 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:11.680855 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="906e649f-4175-441f-a7bd-90980c4e1ffa" containerName="extract"
Apr 24 23:59:11.681005 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:11.680861 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="906e649f-4175-441f-a7bd-90980c4e1ffa" containerName="extract"
Apr 24 23:59:11.681005 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:11.680900 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="906e649f-4175-441f-a7bd-90980c4e1ffa" containerName="extract"
Apr 24 23:59:11.689969 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:11.689941 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48356r9r9"
Apr 24 23:59:11.691009 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:11.690983 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48356r9r9"]
Apr 24 23:59:11.692161 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:11.692139 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 24 23:59:11.693093 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:11.693070 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-7pf4m\""
Apr 24 23:59:11.693177 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:11.693112 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 24 23:59:11.847863 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:11.847823 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1238885d-17e6-407e-953d-2c017f5963f3-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48356r9r9\" (UID: \"1238885d-17e6-407e-953d-2c017f5963f3\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48356r9r9"
Apr 24 23:59:11.848039 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:11.847877 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1238885d-17e6-407e-953d-2c017f5963f3-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48356r9r9\" (UID: \"1238885d-17e6-407e-953d-2c017f5963f3\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48356r9r9"
Apr 24 23:59:11.848039 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:11.847901 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwlnn\" (UniqueName: \"kubernetes.io/projected/1238885d-17e6-407e-953d-2c017f5963f3-kube-api-access-dwlnn\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48356r9r9\" (UID: \"1238885d-17e6-407e-953d-2c017f5963f3\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48356r9r9"
Apr 24 23:59:11.948852 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:11.948760 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1238885d-17e6-407e-953d-2c017f5963f3-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48356r9r9\" (UID: \"1238885d-17e6-407e-953d-2c017f5963f3\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48356r9r9"
Apr 24 23:59:11.948852 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:11.948817 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1238885d-17e6-407e-953d-2c017f5963f3-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48356r9r9\" (UID: \"1238885d-17e6-407e-953d-2c017f5963f3\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48356r9r9"
Apr 24 23:59:11.948852 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:11.948842 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dwlnn\" (UniqueName: \"kubernetes.io/projected/1238885d-17e6-407e-953d-2c017f5963f3-kube-api-access-dwlnn\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48356r9r9\" (UID: \"1238885d-17e6-407e-953d-2c017f5963f3\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48356r9r9"
Apr 24 23:59:11.949234 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:11.949216 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1238885d-17e6-407e-953d-2c017f5963f3-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48356r9r9\" (UID: \"1238885d-17e6-407e-953d-2c017f5963f3\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48356r9r9"
Apr 24 23:59:11.949269 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:11.949215 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1238885d-17e6-407e-953d-2c017f5963f3-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48356r9r9\" (UID: \"1238885d-17e6-407e-953d-2c017f5963f3\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48356r9r9"
Apr 24 23:59:11.956497 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:11.956462 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwlnn\" (UniqueName: \"kubernetes.io/projected/1238885d-17e6-407e-953d-2c017f5963f3-kube-api-access-dwlnn\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48356r9r9\" (UID: \"1238885d-17e6-407e-953d-2c017f5963f3\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48356r9r9"
Apr 24 23:59:11.999631 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:11.999609 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48356r9r9"
Apr 24 23:59:12.117786 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:12.117763 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48356r9r9"]
Apr 24 23:59:12.120086 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:59:12.120055 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1238885d_17e6_407e_953d_2c017f5963f3.slice/crio-87050f2b67731cc1b50d3db0afe466fc40bfbefd023c8cbd902d2f186844d38b WatchSource:0}: Error finding container 87050f2b67731cc1b50d3db0afe466fc40bfbefd023c8cbd902d2f186844d38b: Status 404 returned error can't find the container with id 87050f2b67731cc1b50d3db0afe466fc40bfbefd023c8cbd902d2f186844d38b
Apr 24 23:59:13.078514 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:13.078482 2569 generic.go:358] "Generic (PLEG): container finished" podID="1238885d-17e6-407e-953d-2c017f5963f3" containerID="e90a554310d21aeaa576ca4135bdf518d0003116d4543f50eee4f59e80b6049b" exitCode=0
Apr 24 23:59:13.078886 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:13.078522 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48356r9r9" event={"ID":"1238885d-17e6-407e-953d-2c017f5963f3","Type":"ContainerDied","Data":"e90a554310d21aeaa576ca4135bdf518d0003116d4543f50eee4f59e80b6049b"}
Apr 24 23:59:13.078886 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:13.078544 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48356r9r9" event={"ID":"1238885d-17e6-407e-953d-2c017f5963f3","Type":"ContainerStarted","Data":"87050f2b67731cc1b50d3db0afe466fc40bfbefd023c8cbd902d2f186844d38b"}
Apr 24 23:59:15.085303 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:15.085271 2569 generic.go:358] "Generic (PLEG): container finished" podID="1238885d-17e6-407e-953d-2c017f5963f3" containerID="d74ecc1ba9a62c90de815461dea766b76aca0349a8edcb7ae17a4446c0daf6a0" exitCode=0
Apr 24 23:59:15.085703 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:15.085360 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48356r9r9" event={"ID":"1238885d-17e6-407e-953d-2c017f5963f3","Type":"ContainerDied","Data":"d74ecc1ba9a62c90de815461dea766b76aca0349a8edcb7ae17a4446c0daf6a0"}
Apr 24 23:59:16.090748 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:16.090714 2569 generic.go:358] "Generic (PLEG): container finished" podID="1238885d-17e6-407e-953d-2c017f5963f3" containerID="b6ca166b0fff8f162b48afc4cc6ccf681648b59cf8d5b3d5aefb3c28a570fb71" exitCode=0
Apr 24 23:59:16.091103 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:16.090775 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48356r9r9" event={"ID":"1238885d-17e6-407e-953d-2c017f5963f3","Type":"ContainerDied","Data":"b6ca166b0fff8f162b48afc4cc6ccf681648b59cf8d5b3d5aefb3c28a570fb71"}
Apr 24 23:59:17.211895 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:17.211870 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48356r9r9"
Apr 24 23:59:17.391615 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:17.391534 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dwlnn\" (UniqueName: \"kubernetes.io/projected/1238885d-17e6-407e-953d-2c017f5963f3-kube-api-access-dwlnn\") pod \"1238885d-17e6-407e-953d-2c017f5963f3\" (UID: \"1238885d-17e6-407e-953d-2c017f5963f3\") "
Apr 24 23:59:17.391756 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:17.391633 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1238885d-17e6-407e-953d-2c017f5963f3-bundle\") pod \"1238885d-17e6-407e-953d-2c017f5963f3\" (UID: \"1238885d-17e6-407e-953d-2c017f5963f3\") "
Apr 24 23:59:17.391756 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:17.391665 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1238885d-17e6-407e-953d-2c017f5963f3-util\") pod \"1238885d-17e6-407e-953d-2c017f5963f3\" (UID: \"1238885d-17e6-407e-953d-2c017f5963f3\") "
Apr 24 23:59:17.392475 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:17.392440 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1238885d-17e6-407e-953d-2c017f5963f3-bundle" (OuterVolumeSpecName: "bundle") pod "1238885d-17e6-407e-953d-2c017f5963f3" (UID: "1238885d-17e6-407e-953d-2c017f5963f3"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 23:59:17.393728 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:17.393705 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1238885d-17e6-407e-953d-2c017f5963f3-kube-api-access-dwlnn" (OuterVolumeSpecName: "kube-api-access-dwlnn") pod "1238885d-17e6-407e-953d-2c017f5963f3" (UID: "1238885d-17e6-407e-953d-2c017f5963f3"). InnerVolumeSpecName "kube-api-access-dwlnn". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 23:59:17.397347 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:17.397311 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1238885d-17e6-407e-953d-2c017f5963f3-util" (OuterVolumeSpecName: "util") pod "1238885d-17e6-407e-953d-2c017f5963f3" (UID: "1238885d-17e6-407e-953d-2c017f5963f3"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 23:59:17.492564 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:17.492537 2569 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1238885d-17e6-407e-953d-2c017f5963f3-bundle\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\""
Apr 24 23:59:17.492564 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:17.492559 2569 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1238885d-17e6-407e-953d-2c017f5963f3-util\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\""
Apr 24 23:59:17.492718 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:17.492570 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dwlnn\" (UniqueName: \"kubernetes.io/projected/1238885d-17e6-407e-953d-2c017f5963f3-kube-api-access-dwlnn\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\""
Apr 24 23:59:18.099212 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:18.099180 2569 util.go:48] "No ready sandbox
for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48356r9r9" Apr 24 23:59:18.099380 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:18.099177 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48356r9r9" event={"ID":"1238885d-17e6-407e-953d-2c017f5963f3","Type":"ContainerDied","Data":"87050f2b67731cc1b50d3db0afe466fc40bfbefd023c8cbd902d2f186844d38b"} Apr 24 23:59:18.099380 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:18.099293 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="87050f2b67731cc1b50d3db0afe466fc40bfbefd023c8cbd902d2f186844d38b" Apr 24 23:59:26.211686 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:26.211651 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebk88n5"] Apr 24 23:59:26.212311 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:26.212288 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1238885d-17e6-407e-953d-2c017f5963f3" containerName="util" Apr 24 23:59:26.212311 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:26.212310 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="1238885d-17e6-407e-953d-2c017f5963f3" containerName="util" Apr 24 23:59:26.212468 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:26.212335 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1238885d-17e6-407e-953d-2c017f5963f3" containerName="extract" Apr 24 23:59:26.212468 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:26.212344 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="1238885d-17e6-407e-953d-2c017f5963f3" containerName="extract" Apr 24 23:59:26.212468 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:26.212377 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing 
container" podUID="1238885d-17e6-407e-953d-2c017f5963f3" containerName="pull" Apr 24 23:59:26.212468 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:26.212385 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="1238885d-17e6-407e-953d-2c017f5963f3" containerName="pull" Apr 24 23:59:26.212625 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:26.212527 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="1238885d-17e6-407e-953d-2c017f5963f3" containerName="extract" Apr 24 23:59:26.217894 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:26.217872 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebk88n5" Apr 24 23:59:26.221893 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:26.221870 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 24 23:59:26.222060 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:26.222043 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 24 23:59:26.222221 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:26.222193 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-7pf4m\"" Apr 24 23:59:26.224433 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:26.224396 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebk88n5"] Apr 24 23:59:26.360020 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:26.359990 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jckj\" (UniqueName: \"kubernetes.io/projected/073c83e5-a124-4d73-b63f-31173997510d-kube-api-access-8jckj\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebk88n5\" 
(UID: \"073c83e5-a124-4d73-b63f-31173997510d\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebk88n5" Apr 24 23:59:26.360179 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:26.360025 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/073c83e5-a124-4d73-b63f-31173997510d-util\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebk88n5\" (UID: \"073c83e5-a124-4d73-b63f-31173997510d\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebk88n5" Apr 24 23:59:26.360179 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:26.360056 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/073c83e5-a124-4d73-b63f-31173997510d-bundle\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebk88n5\" (UID: \"073c83e5-a124-4d73-b63f-31173997510d\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebk88n5" Apr 24 23:59:26.460887 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:26.460853 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/073c83e5-a124-4d73-b63f-31173997510d-bundle\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebk88n5\" (UID: \"073c83e5-a124-4d73-b63f-31173997510d\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebk88n5" Apr 24 23:59:26.461062 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:26.460925 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8jckj\" (UniqueName: \"kubernetes.io/projected/073c83e5-a124-4d73-b63f-31173997510d-kube-api-access-8jckj\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebk88n5\" (UID: 
\"073c83e5-a124-4d73-b63f-31173997510d\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebk88n5" Apr 24 23:59:26.461062 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:26.460954 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/073c83e5-a124-4d73-b63f-31173997510d-util\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebk88n5\" (UID: \"073c83e5-a124-4d73-b63f-31173997510d\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebk88n5" Apr 24 23:59:26.461259 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:26.461236 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/073c83e5-a124-4d73-b63f-31173997510d-bundle\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebk88n5\" (UID: \"073c83e5-a124-4d73-b63f-31173997510d\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebk88n5" Apr 24 23:59:26.461335 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:26.461282 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/073c83e5-a124-4d73-b63f-31173997510d-util\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebk88n5\" (UID: \"073c83e5-a124-4d73-b63f-31173997510d\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebk88n5" Apr 24 23:59:26.475292 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:26.475220 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jckj\" (UniqueName: \"kubernetes.io/projected/073c83e5-a124-4d73-b63f-31173997510d-kube-api-access-8jckj\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebk88n5\" (UID: \"073c83e5-a124-4d73-b63f-31173997510d\") " 
pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebk88n5" Apr 24 23:59:26.528080 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:26.528054 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebk88n5" Apr 24 23:59:26.651036 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:26.650981 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebk88n5"] Apr 24 23:59:26.655501 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:59:26.655471 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod073c83e5_a124_4d73_b63f_31173997510d.slice/crio-ff6105382b56891a2c3c0099ab0cc9bd17195e076802ace9c2b8a0d0c8c8c154 WatchSource:0}: Error finding container ff6105382b56891a2c3c0099ab0cc9bd17195e076802ace9c2b8a0d0c8c8c154: Status 404 returned error can't find the container with id ff6105382b56891a2c3c0099ab0cc9bd17195e076802ace9c2b8a0d0c8c8c154 Apr 24 23:59:27.127364 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:27.127272 2569 generic.go:358] "Generic (PLEG): container finished" podID="073c83e5-a124-4d73-b63f-31173997510d" containerID="f00a0c7e62a3340fbe3d0cf49a7cf551895f6f7216e6b942d0459cf1eaa3017a" exitCode=0 Apr 24 23:59:27.127364 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:27.127349 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebk88n5" event={"ID":"073c83e5-a124-4d73-b63f-31173997510d","Type":"ContainerDied","Data":"f00a0c7e62a3340fbe3d0cf49a7cf551895f6f7216e6b942d0459cf1eaa3017a"} Apr 24 23:59:27.127556 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:27.127382 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebk88n5" event={"ID":"073c83e5-a124-4d73-b63f-31173997510d","Type":"ContainerStarted","Data":"ff6105382b56891a2c3c0099ab0cc9bd17195e076802ace9c2b8a0d0c8c8c154"} Apr 24 23:59:29.134070 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:29.134039 2569 generic.go:358] "Generic (PLEG): container finished" podID="073c83e5-a124-4d73-b63f-31173997510d" containerID="020e3fe438e36b09916341807532fc3a200d326d3e4a98a10d86d8e074576f99" exitCode=0 Apr 24 23:59:29.134457 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:29.134115 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebk88n5" event={"ID":"073c83e5-a124-4d73-b63f-31173997510d","Type":"ContainerDied","Data":"020e3fe438e36b09916341807532fc3a200d326d3e4a98a10d86d8e074576f99"} Apr 24 23:59:30.139124 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:30.139091 2569 generic.go:358] "Generic (PLEG): container finished" podID="073c83e5-a124-4d73-b63f-31173997510d" containerID="412a115b6a770e740e85fc28c361edb993ee4ad5aea254e3826d4468804c961c" exitCode=0 Apr 24 23:59:30.139601 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:30.139165 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebk88n5" event={"ID":"073c83e5-a124-4d73-b63f-31173997510d","Type":"ContainerDied","Data":"412a115b6a770e740e85fc28c361edb993ee4ad5aea254e3826d4468804c961c"} Apr 24 23:59:31.269688 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:31.269668 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebk88n5" Apr 24 23:59:31.298304 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:31.298283 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8jckj\" (UniqueName: \"kubernetes.io/projected/073c83e5-a124-4d73-b63f-31173997510d-kube-api-access-8jckj\") pod \"073c83e5-a124-4d73-b63f-31173997510d\" (UID: \"073c83e5-a124-4d73-b63f-31173997510d\") " Apr 24 23:59:31.298458 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:31.298329 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/073c83e5-a124-4d73-b63f-31173997510d-bundle\") pod \"073c83e5-a124-4d73-b63f-31173997510d\" (UID: \"073c83e5-a124-4d73-b63f-31173997510d\") " Apr 24 23:59:31.298458 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:31.298357 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/073c83e5-a124-4d73-b63f-31173997510d-util\") pod \"073c83e5-a124-4d73-b63f-31173997510d\" (UID: \"073c83e5-a124-4d73-b63f-31173997510d\") " Apr 24 23:59:31.299206 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:31.299181 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/073c83e5-a124-4d73-b63f-31173997510d-bundle" (OuterVolumeSpecName: "bundle") pod "073c83e5-a124-4d73-b63f-31173997510d" (UID: "073c83e5-a124-4d73-b63f-31173997510d"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 23:59:31.300482 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:31.300460 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/073c83e5-a124-4d73-b63f-31173997510d-kube-api-access-8jckj" (OuterVolumeSpecName: "kube-api-access-8jckj") pod "073c83e5-a124-4d73-b63f-31173997510d" (UID: "073c83e5-a124-4d73-b63f-31173997510d"). InnerVolumeSpecName "kube-api-access-8jckj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 23:59:31.304152 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:31.303988 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/073c83e5-a124-4d73-b63f-31173997510d-util" (OuterVolumeSpecName: "util") pod "073c83e5-a124-4d73-b63f-31173997510d" (UID: "073c83e5-a124-4d73-b63f-31173997510d"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 23:59:31.399655 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:31.399582 2569 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/073c83e5-a124-4d73-b63f-31173997510d-util\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\"" Apr 24 23:59:31.399655 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:31.399612 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8jckj\" (UniqueName: \"kubernetes.io/projected/073c83e5-a124-4d73-b63f-31173997510d-kube-api-access-8jckj\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\"" Apr 24 23:59:31.399655 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:31.399627 2569 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/073c83e5-a124-4d73-b63f-31173997510d-bundle\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\"" Apr 24 23:59:31.875768 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:31.875735 2569 kubelet.go:2537] "SyncLoop 
ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-q42qc"] Apr 24 23:59:31.876042 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:31.876030 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="073c83e5-a124-4d73-b63f-31173997510d" containerName="extract" Apr 24 23:59:31.876085 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:31.876043 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="073c83e5-a124-4d73-b63f-31173997510d" containerName="extract" Apr 24 23:59:31.876085 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:31.876054 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="073c83e5-a124-4d73-b63f-31173997510d" containerName="pull" Apr 24 23:59:31.876085 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:31.876060 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="073c83e5-a124-4d73-b63f-31173997510d" containerName="pull" Apr 24 23:59:31.876085 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:31.876075 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="073c83e5-a124-4d73-b63f-31173997510d" containerName="util" Apr 24 23:59:31.876085 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:31.876080 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="073c83e5-a124-4d73-b63f-31173997510d" containerName="util" Apr 24 23:59:31.876223 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:31.876121 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="073c83e5-a124-4d73-b63f-31173997510d" containerName="extract" Apr 24 23:59:31.878941 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:31.878927 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-q42qc" Apr 24 23:59:31.882655 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:31.882631 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"cacerts\"" Apr 24 23:59:31.882788 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:31.882724 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 24 23:59:31.882927 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:31.882909 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-openshift-gateway-dockercfg-6v76r\"" Apr 24 23:59:31.883175 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:31.883160 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-tls\"" Apr 24 23:59:31.883222 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:31.883199 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 24 23:59:31.883623 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:31.883607 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 24 23:59:31.883724 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:31.883608 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istio-kubeconfig\"" Apr 24 23:59:31.889481 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:31.889453 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-q42qc"] Apr 24 23:59:31.903081 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:31.903051 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: 
\"kubernetes.io/secret/a0a9deeb-3889-4e4b-bca0-c25ded8bb5f2-istio-kubeconfig\") pod \"istiod-openshift-gateway-7cd77c7ffd-q42qc\" (UID: \"a0a9deeb-3889-4e4b-bca0-c25ded8bb5f2\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-q42qc" Apr 24 23:59:31.903190 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:31.903092 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/a0a9deeb-3889-4e4b-bca0-c25ded8bb5f2-cacerts\") pod \"istiod-openshift-gateway-7cd77c7ffd-q42qc\" (UID: \"a0a9deeb-3889-4e4b-bca0-c25ded8bb5f2\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-q42qc" Apr 24 23:59:31.903190 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:31.903122 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzxlt\" (UniqueName: \"kubernetes.io/projected/a0a9deeb-3889-4e4b-bca0-c25ded8bb5f2-kube-api-access-gzxlt\") pod \"istiod-openshift-gateway-7cd77c7ffd-q42qc\" (UID: \"a0a9deeb-3889-4e4b-bca0-c25ded8bb5f2\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-q42qc" Apr 24 23:59:31.903190 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:31.903161 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/a0a9deeb-3889-4e4b-bca0-c25ded8bb5f2-local-certs\") pod \"istiod-openshift-gateway-7cd77c7ffd-q42qc\" (UID: \"a0a9deeb-3889-4e4b-bca0-c25ded8bb5f2\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-q42qc" Apr 24 23:59:31.903348 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:31.903210 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/a0a9deeb-3889-4e4b-bca0-c25ded8bb5f2-istio-token\") pod \"istiod-openshift-gateway-7cd77c7ffd-q42qc\" (UID: 
\"a0a9deeb-3889-4e4b-bca0-c25ded8bb5f2\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-q42qc" Apr 24 23:59:31.903348 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:31.903231 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/a0a9deeb-3889-4e4b-bca0-c25ded8bb5f2-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-7cd77c7ffd-q42qc\" (UID: \"a0a9deeb-3889-4e4b-bca0-c25ded8bb5f2\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-q42qc" Apr 24 23:59:31.903348 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:31.903270 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/a0a9deeb-3889-4e4b-bca0-c25ded8bb5f2-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-7cd77c7ffd-q42qc\" (UID: \"a0a9deeb-3889-4e4b-bca0-c25ded8bb5f2\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-q42qc" Apr 24 23:59:32.003768 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:32.003739 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/a0a9deeb-3889-4e4b-bca0-c25ded8bb5f2-istio-kubeconfig\") pod \"istiod-openshift-gateway-7cd77c7ffd-q42qc\" (UID: \"a0a9deeb-3889-4e4b-bca0-c25ded8bb5f2\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-q42qc" Apr 24 23:59:32.003903 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:32.003780 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/a0a9deeb-3889-4e4b-bca0-c25ded8bb5f2-cacerts\") pod \"istiod-openshift-gateway-7cd77c7ffd-q42qc\" (UID: \"a0a9deeb-3889-4e4b-bca0-c25ded8bb5f2\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-q42qc" Apr 24 23:59:32.003903 ip-10-0-133-214 kubenswrapper[2569]: I0424 
23:59:32.003812 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gzxlt\" (UniqueName: \"kubernetes.io/projected/a0a9deeb-3889-4e4b-bca0-c25ded8bb5f2-kube-api-access-gzxlt\") pod \"istiod-openshift-gateway-7cd77c7ffd-q42qc\" (UID: \"a0a9deeb-3889-4e4b-bca0-c25ded8bb5f2\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-q42qc" Apr 24 23:59:32.003903 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:32.003881 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/a0a9deeb-3889-4e4b-bca0-c25ded8bb5f2-local-certs\") pod \"istiod-openshift-gateway-7cd77c7ffd-q42qc\" (UID: \"a0a9deeb-3889-4e4b-bca0-c25ded8bb5f2\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-q42qc" Apr 24 23:59:32.004052 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:32.003944 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/a0a9deeb-3889-4e4b-bca0-c25ded8bb5f2-istio-token\") pod \"istiod-openshift-gateway-7cd77c7ffd-q42qc\" (UID: \"a0a9deeb-3889-4e4b-bca0-c25ded8bb5f2\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-q42qc" Apr 24 23:59:32.004052 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:32.003969 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/a0a9deeb-3889-4e4b-bca0-c25ded8bb5f2-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-7cd77c7ffd-q42qc\" (UID: \"a0a9deeb-3889-4e4b-bca0-c25ded8bb5f2\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-q42qc" Apr 24 23:59:32.004255 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:32.004228 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: 
\"kubernetes.io/configmap/a0a9deeb-3889-4e4b-bca0-c25ded8bb5f2-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-7cd77c7ffd-q42qc\" (UID: \"a0a9deeb-3889-4e4b-bca0-c25ded8bb5f2\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-q42qc" Apr 24 23:59:32.005007 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:32.004980 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/a0a9deeb-3889-4e4b-bca0-c25ded8bb5f2-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-7cd77c7ffd-q42qc\" (UID: \"a0a9deeb-3889-4e4b-bca0-c25ded8bb5f2\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-q42qc" Apr 24 23:59:32.006161 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:32.006141 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/a0a9deeb-3889-4e4b-bca0-c25ded8bb5f2-local-certs\") pod \"istiod-openshift-gateway-7cd77c7ffd-q42qc\" (UID: \"a0a9deeb-3889-4e4b-bca0-c25ded8bb5f2\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-q42qc" Apr 24 23:59:32.006324 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:32.006306 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/a0a9deeb-3889-4e4b-bca0-c25ded8bb5f2-istio-kubeconfig\") pod \"istiod-openshift-gateway-7cd77c7ffd-q42qc\" (UID: \"a0a9deeb-3889-4e4b-bca0-c25ded8bb5f2\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-q42qc" Apr 24 23:59:32.006386 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:32.006323 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/a0a9deeb-3889-4e4b-bca0-c25ded8bb5f2-cacerts\") pod \"istiod-openshift-gateway-7cd77c7ffd-q42qc\" (UID: \"a0a9deeb-3889-4e4b-bca0-c25ded8bb5f2\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-q42qc" Apr 24 
23:59:32.006700 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:32.006683 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/a0a9deeb-3889-4e4b-bca0-c25ded8bb5f2-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-7cd77c7ffd-q42qc\" (UID: \"a0a9deeb-3889-4e4b-bca0-c25ded8bb5f2\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-q42qc"
Apr 24 23:59:32.011539 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:32.011512 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/a0a9deeb-3889-4e4b-bca0-c25ded8bb5f2-istio-token\") pod \"istiod-openshift-gateway-7cd77c7ffd-q42qc\" (UID: \"a0a9deeb-3889-4e4b-bca0-c25ded8bb5f2\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-q42qc"
Apr 24 23:59:32.011780 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:32.011762 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzxlt\" (UniqueName: \"kubernetes.io/projected/a0a9deeb-3889-4e4b-bca0-c25ded8bb5f2-kube-api-access-gzxlt\") pod \"istiod-openshift-gateway-7cd77c7ffd-q42qc\" (UID: \"a0a9deeb-3889-4e4b-bca0-c25ded8bb5f2\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-q42qc"
Apr 24 23:59:32.148902 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:32.148818 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebk88n5"
Apr 24 23:59:32.148902 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:32.148824 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebk88n5" event={"ID":"073c83e5-a124-4d73-b63f-31173997510d","Type":"ContainerDied","Data":"ff6105382b56891a2c3c0099ab0cc9bd17195e076802ace9c2b8a0d0c8c8c154"}
Apr 24 23:59:32.148902 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:32.148857 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ff6105382b56891a2c3c0099ab0cc9bd17195e076802ace9c2b8a0d0c8c8c154"
Apr 24 23:59:32.188181 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:32.188153 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-q42qc"
Apr 24 23:59:32.319698 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:32.319669 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-q42qc"]
Apr 24 23:59:32.323033 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:59:32.323010 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0a9deeb_3889_4e4b_bca0_c25ded8bb5f2.slice/crio-a37a2cf92f55709d6e53e33d697ff2a3e5bb3bdc0ab718e81fd887622ecc73c6 WatchSource:0}: Error finding container a37a2cf92f55709d6e53e33d697ff2a3e5bb3bdc0ab718e81fd887622ecc73c6: Status 404 returned error can't find the container with id a37a2cf92f55709d6e53e33d697ff2a3e5bb3bdc0ab718e81fd887622ecc73c6
Apr 24 23:59:33.154319 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:33.154283 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-q42qc" event={"ID":"a0a9deeb-3889-4e4b-bca0-c25ded8bb5f2","Type":"ContainerStarted","Data":"a37a2cf92f55709d6e53e33d697ff2a3e5bb3bdc0ab718e81fd887622ecc73c6"}
Apr 24 23:59:34.660731 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:34.660691 2569 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"}
Apr 24 23:59:34.661082 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:34.660768 2569 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"}
Apr 24 23:59:35.161928 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:35.161889 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-q42qc" event={"ID":"a0a9deeb-3889-4e4b-bca0-c25ded8bb5f2","Type":"ContainerStarted","Data":"44ff435f79c5960551a0bafa614cdb59b0ac6c37ea2f3bd37360e9aa1cea7f31"}
Apr 24 23:59:35.162104 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:35.162091 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-q42qc"
Apr 24 23:59:35.163593 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:35.163569 2569 patch_prober.go:28] interesting pod/istiod-openshift-gateway-7cd77c7ffd-q42qc container/discovery namespace/openshift-ingress: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=
Apr 24 23:59:35.163704 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:35.163625 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-q42qc" podUID="a0a9deeb-3889-4e4b-bca0-c25ded8bb5f2" containerName="discovery" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 23:59:35.184071 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:35.184030 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-q42qc" podStartSLOduration=1.848872788 podStartE2EDuration="4.183993307s" podCreationTimestamp="2026-04-24 23:59:31 +0000 UTC" firstStartedPulling="2026-04-24 23:59:32.325373486 +0000 UTC m=+350.930100063" lastFinishedPulling="2026-04-24 23:59:34.660494006 +0000 UTC m=+353.265220582" observedRunningTime="2026-04-24 23:59:35.182897416 +0000 UTC m=+353.787624015" watchObservedRunningTime="2026-04-24 23:59:35.183993307 +0000 UTC m=+353.788719905"
Apr 24 23:59:36.166849 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:36.166821 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-q42qc"
Apr 24 23:59:46.069632 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:46.069597 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bcw8qs"]
Apr 24 23:59:46.073222 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:46.073206 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bcw8qs"
Apr 24 23:59:46.075549 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:46.075524 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 24 23:59:46.076194 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:46.076166 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 24 23:59:46.076299 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:46.076189 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-7pf4m\""
Apr 24 23:59:46.080692 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:46.080671 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bcw8qs"]
Apr 24 23:59:46.119981 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:46.119949 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f27b97cc-170a-465c-9ce2-423113e2a5f1-util\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bcw8qs\" (UID: \"f27b97cc-170a-465c-9ce2-423113e2a5f1\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bcw8qs"
Apr 24 23:59:46.120123 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:46.120006 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f27b97cc-170a-465c-9ce2-423113e2a5f1-bundle\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bcw8qs\" (UID: \"f27b97cc-170a-465c-9ce2-423113e2a5f1\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bcw8qs"
Apr 24 23:59:46.120123 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:46.120038 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxcng\" (UniqueName: \"kubernetes.io/projected/f27b97cc-170a-465c-9ce2-423113e2a5f1-kube-api-access-qxcng\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bcw8qs\" (UID: \"f27b97cc-170a-465c-9ce2-423113e2a5f1\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bcw8qs"
Apr 24 23:59:46.176081 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:46.176049 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lbp92"]
Apr 24 23:59:46.179215 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:46.179199 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lbp92"
Apr 24 23:59:46.187175 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:46.187153 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lbp92"]
Apr 24 23:59:46.220544 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:46.220515 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmcpp\" (UniqueName: \"kubernetes.io/projected/cbb3c5d2-19c2-4bad-8107-7a97bc0dc86e-kube-api-access-qmcpp\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lbp92\" (UID: \"cbb3c5d2-19c2-4bad-8107-7a97bc0dc86e\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lbp92"
Apr 24 23:59:46.220693 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:46.220554 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cbb3c5d2-19c2-4bad-8107-7a97bc0dc86e-util\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lbp92\" (UID: \"cbb3c5d2-19c2-4bad-8107-7a97bc0dc86e\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lbp92"
Apr 24 23:59:46.220693 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:46.220619 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f27b97cc-170a-465c-9ce2-423113e2a5f1-bundle\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bcw8qs\" (UID: \"f27b97cc-170a-465c-9ce2-423113e2a5f1\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bcw8qs"
Apr 24 23:59:46.220693 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:46.220652 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qxcng\" (UniqueName: \"kubernetes.io/projected/f27b97cc-170a-465c-9ce2-423113e2a5f1-kube-api-access-qxcng\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bcw8qs\" (UID: \"f27b97cc-170a-465c-9ce2-423113e2a5f1\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bcw8qs"
Apr 24 23:59:46.220693 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:46.220683 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cbb3c5d2-19c2-4bad-8107-7a97bc0dc86e-bundle\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lbp92\" (UID: \"cbb3c5d2-19c2-4bad-8107-7a97bc0dc86e\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lbp92"
Apr 24 23:59:46.220841 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:46.220736 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f27b97cc-170a-465c-9ce2-423113e2a5f1-util\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bcw8qs\" (UID: \"f27b97cc-170a-465c-9ce2-423113e2a5f1\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bcw8qs"
Apr 24 23:59:46.220931 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:46.220915 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f27b97cc-170a-465c-9ce2-423113e2a5f1-bundle\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bcw8qs\" (UID: \"f27b97cc-170a-465c-9ce2-423113e2a5f1\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bcw8qs"
Apr 24 23:59:46.220983 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:46.220970 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f27b97cc-170a-465c-9ce2-423113e2a5f1-util\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bcw8qs\" (UID: \"f27b97cc-170a-465c-9ce2-423113e2a5f1\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bcw8qs"
Apr 24 23:59:46.229545 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:46.229523 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxcng\" (UniqueName: \"kubernetes.io/projected/f27b97cc-170a-465c-9ce2-423113e2a5f1-kube-api-access-qxcng\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bcw8qs\" (UID: \"f27b97cc-170a-465c-9ce2-423113e2a5f1\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bcw8qs"
Apr 24 23:59:46.281901 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:46.281876 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30ftp62"]
Apr 24 23:59:46.286720 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:46.286694 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30ftp62"
Apr 24 23:59:46.291749 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:46.291719 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30ftp62"]
Apr 24 23:59:46.321856 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:46.321788 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0db4b491-49f0-499f-be74-9f1f25fc5465-util\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30ftp62\" (UID: \"0db4b491-49f0-499f-be74-9f1f25fc5465\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30ftp62"
Apr 24 23:59:46.321856 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:46.321826 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qmcpp\" (UniqueName: \"kubernetes.io/projected/cbb3c5d2-19c2-4bad-8107-7a97bc0dc86e-kube-api-access-qmcpp\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lbp92\" (UID: \"cbb3c5d2-19c2-4bad-8107-7a97bc0dc86e\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lbp92"
Apr 24 23:59:46.321856 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:46.321851 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cbb3c5d2-19c2-4bad-8107-7a97bc0dc86e-util\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lbp92\" (UID: \"cbb3c5d2-19c2-4bad-8107-7a97bc0dc86e\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lbp92"
Apr 24 23:59:46.322050 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:46.321908 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cbb3c5d2-19c2-4bad-8107-7a97bc0dc86e-bundle\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lbp92\" (UID: \"cbb3c5d2-19c2-4bad-8107-7a97bc0dc86e\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lbp92"
Apr 24 23:59:46.322050 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:46.321957 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0db4b491-49f0-499f-be74-9f1f25fc5465-bundle\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30ftp62\" (UID: \"0db4b491-49f0-499f-be74-9f1f25fc5465\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30ftp62"
Apr 24 23:59:46.322050 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:46.321986 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5hkk\" (UniqueName: \"kubernetes.io/projected/0db4b491-49f0-499f-be74-9f1f25fc5465-kube-api-access-p5hkk\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30ftp62\" (UID: \"0db4b491-49f0-499f-be74-9f1f25fc5465\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30ftp62"
Apr 24 23:59:46.322219 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:46.322203 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cbb3c5d2-19c2-4bad-8107-7a97bc0dc86e-util\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lbp92\" (UID: \"cbb3c5d2-19c2-4bad-8107-7a97bc0dc86e\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lbp92"
Apr 24 23:59:46.322303 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:46.322233 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cbb3c5d2-19c2-4bad-8107-7a97bc0dc86e-bundle\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lbp92\" (UID: \"cbb3c5d2-19c2-4bad-8107-7a97bc0dc86e\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lbp92"
Apr 24 23:59:46.331795 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:46.331773 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmcpp\" (UniqueName: \"kubernetes.io/projected/cbb3c5d2-19c2-4bad-8107-7a97bc0dc86e-kube-api-access-qmcpp\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lbp92\" (UID: \"cbb3c5d2-19c2-4bad-8107-7a97bc0dc86e\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lbp92"
Apr 24 23:59:46.373458 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:46.373425 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88l9dnm"]
Apr 24 23:59:46.376808 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:46.376790 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88l9dnm"
Apr 24 23:59:46.382605 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:46.382583 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bcw8qs"
Apr 24 23:59:46.384012 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:46.383988 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88l9dnm"]
Apr 24 23:59:46.422976 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:46.422947 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a3f2616b-7a4f-49dd-ba8e-43ad9b1e50ca-bundle\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88l9dnm\" (UID: \"a3f2616b-7a4f-49dd-ba8e-43ad9b1e50ca\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88l9dnm"
Apr 24 23:59:46.423159 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:46.422983 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a3f2616b-7a4f-49dd-ba8e-43ad9b1e50ca-util\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88l9dnm\" (UID: \"a3f2616b-7a4f-49dd-ba8e-43ad9b1e50ca\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88l9dnm"
Apr 24 23:59:46.423159 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:46.423009 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0db4b491-49f0-499f-be74-9f1f25fc5465-bundle\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30ftp62\" (UID: \"0db4b491-49f0-499f-be74-9f1f25fc5465\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30ftp62"
Apr 24 23:59:46.423159 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:46.423032 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p5hkk\" (UniqueName: \"kubernetes.io/projected/0db4b491-49f0-499f-be74-9f1f25fc5465-kube-api-access-p5hkk\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30ftp62\" (UID: \"0db4b491-49f0-499f-be74-9f1f25fc5465\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30ftp62"
Apr 24 23:59:46.423159 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:46.423057 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0db4b491-49f0-499f-be74-9f1f25fc5465-util\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30ftp62\" (UID: \"0db4b491-49f0-499f-be74-9f1f25fc5465\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30ftp62"
Apr 24 23:59:46.423159 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:46.423103 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8v9t2\" (UniqueName: \"kubernetes.io/projected/a3f2616b-7a4f-49dd-ba8e-43ad9b1e50ca-kube-api-access-8v9t2\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88l9dnm\" (UID: \"a3f2616b-7a4f-49dd-ba8e-43ad9b1e50ca\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88l9dnm"
Apr 24 23:59:46.423558 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:46.423529 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0db4b491-49f0-499f-be74-9f1f25fc5465-util\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30ftp62\" (UID: \"0db4b491-49f0-499f-be74-9f1f25fc5465\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30ftp62"
Apr 24 23:59:46.423634 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:46.423540 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0db4b491-49f0-499f-be74-9f1f25fc5465-bundle\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30ftp62\" (UID: \"0db4b491-49f0-499f-be74-9f1f25fc5465\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30ftp62"
Apr 24 23:59:46.436383 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:46.436355 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5hkk\" (UniqueName: \"kubernetes.io/projected/0db4b491-49f0-499f-be74-9f1f25fc5465-kube-api-access-p5hkk\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30ftp62\" (UID: \"0db4b491-49f0-499f-be74-9f1f25fc5465\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30ftp62"
Apr 24 23:59:46.489088 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:46.489058 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lbp92"
Apr 24 23:59:46.503990 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:46.503968 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bcw8qs"]
Apr 24 23:59:46.505723 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:59:46.505695 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf27b97cc_170a_465c_9ce2_423113e2a5f1.slice/crio-6d4717e266946aabccf3043557e117cd63a9af6d0d48490f65bc7a948f5d21a9 WatchSource:0}: Error finding container 6d4717e266946aabccf3043557e117cd63a9af6d0d48490f65bc7a948f5d21a9: Status 404 returned error can't find the container with id 6d4717e266946aabccf3043557e117cd63a9af6d0d48490f65bc7a948f5d21a9
Apr 24 23:59:46.524136 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:46.524110 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8v9t2\" (UniqueName: \"kubernetes.io/projected/a3f2616b-7a4f-49dd-ba8e-43ad9b1e50ca-kube-api-access-8v9t2\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88l9dnm\" (UID: \"a3f2616b-7a4f-49dd-ba8e-43ad9b1e50ca\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88l9dnm"
Apr 24 23:59:46.524238 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:46.524165 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a3f2616b-7a4f-49dd-ba8e-43ad9b1e50ca-bundle\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88l9dnm\" (UID: \"a3f2616b-7a4f-49dd-ba8e-43ad9b1e50ca\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88l9dnm"
Apr 24 23:59:46.524238 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:46.524191 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a3f2616b-7a4f-49dd-ba8e-43ad9b1e50ca-util\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88l9dnm\" (UID: \"a3f2616b-7a4f-49dd-ba8e-43ad9b1e50ca\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88l9dnm"
Apr 24 23:59:46.524555 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:46.524534 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a3f2616b-7a4f-49dd-ba8e-43ad9b1e50ca-bundle\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88l9dnm\" (UID: \"a3f2616b-7a4f-49dd-ba8e-43ad9b1e50ca\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88l9dnm"
Apr 24 23:59:46.524618 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:46.524582 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a3f2616b-7a4f-49dd-ba8e-43ad9b1e50ca-util\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88l9dnm\" (UID: \"a3f2616b-7a4f-49dd-ba8e-43ad9b1e50ca\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88l9dnm"
Apr 24 23:59:46.533094 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:46.533071 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8v9t2\" (UniqueName: \"kubernetes.io/projected/a3f2616b-7a4f-49dd-ba8e-43ad9b1e50ca-kube-api-access-8v9t2\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88l9dnm\" (UID: \"a3f2616b-7a4f-49dd-ba8e-43ad9b1e50ca\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88l9dnm"
Apr 24 23:59:46.598658 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:46.598578 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30ftp62"
Apr 24 23:59:46.611722 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:46.611690 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lbp92"]
Apr 24 23:59:46.613740 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:59:46.613710 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcbb3c5d2_19c2_4bad_8107_7a97bc0dc86e.slice/crio-0b1b48b4091e74fc033eba5550f1482b53c1938fa6f8ea1de9610bae25b0c39a WatchSource:0}: Error finding container 0b1b48b4091e74fc033eba5550f1482b53c1938fa6f8ea1de9610bae25b0c39a: Status 404 returned error can't find the container with id 0b1b48b4091e74fc033eba5550f1482b53c1938fa6f8ea1de9610bae25b0c39a
Apr 24 23:59:46.686856 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:46.686834 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88l9dnm"
Apr 24 23:59:46.727844 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:46.727811 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30ftp62"]
Apr 24 23:59:46.731059 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:59:46.731030 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0db4b491_49f0_499f_be74_9f1f25fc5465.slice/crio-cd84915c31d88c81fea91a847419996c837cd55882c835b1b88c8a98aa6c019c WatchSource:0}: Error finding container cd84915c31d88c81fea91a847419996c837cd55882c835b1b88c8a98aa6c019c: Status 404 returned error can't find the container with id cd84915c31d88c81fea91a847419996c837cd55882c835b1b88c8a98aa6c019c
Apr 24 23:59:46.817874 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:46.817851 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88l9dnm"]
Apr 24 23:59:46.823825 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:59:46.823793 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3f2616b_7a4f_49dd_ba8e_43ad9b1e50ca.slice/crio-18187e27daedbcdb32e6baf5cdba6e76faa8038374699b881784a171156361e9 WatchSource:0}: Error finding container 18187e27daedbcdb32e6baf5cdba6e76faa8038374699b881784a171156361e9: Status 404 returned error can't find the container with id 18187e27daedbcdb32e6baf5cdba6e76faa8038374699b881784a171156361e9
Apr 24 23:59:47.201885 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:47.201796 2569 generic.go:358] "Generic (PLEG): container finished" podID="cbb3c5d2-19c2-4bad-8107-7a97bc0dc86e" containerID="330d9268b463fff4e1898a9cd657a8eaaee4e3925806345f8906f0d49ceff078" exitCode=0
Apr 24 23:59:47.202292 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:47.201887 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lbp92" event={"ID":"cbb3c5d2-19c2-4bad-8107-7a97bc0dc86e","Type":"ContainerDied","Data":"330d9268b463fff4e1898a9cd657a8eaaee4e3925806345f8906f0d49ceff078"}
Apr 24 23:59:47.202292 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:47.201925 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lbp92" event={"ID":"cbb3c5d2-19c2-4bad-8107-7a97bc0dc86e","Type":"ContainerStarted","Data":"0b1b48b4091e74fc033eba5550f1482b53c1938fa6f8ea1de9610bae25b0c39a"}
Apr 24 23:59:47.203290 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:47.203205 2569 generic.go:358] "Generic (PLEG): container finished" podID="a3f2616b-7a4f-49dd-ba8e-43ad9b1e50ca" containerID="e2385899e5d70662136175c055e1f66f3a6a526c84fd5319604ad32e084a8e71" exitCode=0
Apr 24 23:59:47.203290 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:47.203239 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88l9dnm" event={"ID":"a3f2616b-7a4f-49dd-ba8e-43ad9b1e50ca","Type":"ContainerDied","Data":"e2385899e5d70662136175c055e1f66f3a6a526c84fd5319604ad32e084a8e71"}
Apr 24 23:59:47.203290 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:47.203281 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88l9dnm" event={"ID":"a3f2616b-7a4f-49dd-ba8e-43ad9b1e50ca","Type":"ContainerStarted","Data":"18187e27daedbcdb32e6baf5cdba6e76faa8038374699b881784a171156361e9"}
Apr 24 23:59:47.204684 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:47.204661 2569 generic.go:358] "Generic (PLEG): container finished" podID="0db4b491-49f0-499f-be74-9f1f25fc5465" containerID="87033783131f820435aca0e77d9b2feb219bfc8f3396c720144ef483db7e0634" exitCode=0
Apr 24 23:59:47.204791 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:47.204691 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30ftp62" event={"ID":"0db4b491-49f0-499f-be74-9f1f25fc5465","Type":"ContainerDied","Data":"87033783131f820435aca0e77d9b2feb219bfc8f3396c720144ef483db7e0634"}
Apr 24 23:59:47.204791 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:47.204726 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30ftp62" event={"ID":"0db4b491-49f0-499f-be74-9f1f25fc5465","Type":"ContainerStarted","Data":"cd84915c31d88c81fea91a847419996c837cd55882c835b1b88c8a98aa6c019c"}
Apr 24 23:59:47.206220 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:47.206067 2569 generic.go:358] "Generic (PLEG): container finished" podID="f27b97cc-170a-465c-9ce2-423113e2a5f1" containerID="363489072a424018fe48a0ab9bedd856604cd5577b2d278d8ede58edd88d0a53" exitCode=0
Apr 24 23:59:47.206220 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:47.206133 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bcw8qs" event={"ID":"f27b97cc-170a-465c-9ce2-423113e2a5f1","Type":"ContainerDied","Data":"363489072a424018fe48a0ab9bedd856604cd5577b2d278d8ede58edd88d0a53"}
Apr 24 23:59:47.206220 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:47.206153 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bcw8qs" event={"ID":"f27b97cc-170a-465c-9ce2-423113e2a5f1","Type":"ContainerStarted","Data":"6d4717e266946aabccf3043557e117cd63a9af6d0d48490f65bc7a948f5d21a9"}
Apr 24 23:59:49.214756 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:49.214726 2569 generic.go:358] "Generic (PLEG): container finished" podID="cbb3c5d2-19c2-4bad-8107-7a97bc0dc86e" containerID="0469797ce4db4a316ca9f2aabadb58bcd302a974989aa5a5df804e0709f0a810" exitCode=0
Apr 24 23:59:49.215213 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:49.214811 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lbp92" event={"ID":"cbb3c5d2-19c2-4bad-8107-7a97bc0dc86e","Type":"ContainerDied","Data":"0469797ce4db4a316ca9f2aabadb58bcd302a974989aa5a5df804e0709f0a810"}
Apr 24 23:59:49.216303 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:49.216280 2569 generic.go:358] "Generic (PLEG): container finished" podID="a3f2616b-7a4f-49dd-ba8e-43ad9b1e50ca" containerID="266f0689a91ca09485de0bae1f5c7c7b4968f3602dc28703f2affc645d2e3d6e" exitCode=0
Apr 24 23:59:49.216438 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:49.216356 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88l9dnm" event={"ID":"a3f2616b-7a4f-49dd-ba8e-43ad9b1e50ca","Type":"ContainerDied","Data":"266f0689a91ca09485de0bae1f5c7c7b4968f3602dc28703f2affc645d2e3d6e"}
Apr 24 23:59:49.218017 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:49.217935 2569 generic.go:358] "Generic (PLEG): container finished" podID="0db4b491-49f0-499f-be74-9f1f25fc5465" containerID="5eacd27a266e1779502138c39b654ef21d64f1643a8813007650c3994fd6c664" exitCode=0
Apr 24 23:59:49.218088 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:49.218031 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30ftp62" event={"ID":"0db4b491-49f0-499f-be74-9f1f25fc5465","Type":"ContainerDied","Data":"5eacd27a266e1779502138c39b654ef21d64f1643a8813007650c3994fd6c664"}
Apr 24 23:59:49.219705 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:49.219683 2569 generic.go:358] "Generic (PLEG): container finished" podID="f27b97cc-170a-465c-9ce2-423113e2a5f1" containerID="6327b482567e80fe9f1d2ac1cc0f3cd86f5e7518a8a7662ebc0459048c08cc7e" exitCode=0
Apr 24 23:59:49.219806 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:49.219716 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bcw8qs" event={"ID":"f27b97cc-170a-465c-9ce2-423113e2a5f1","Type":"ContainerDied","Data":"6327b482567e80fe9f1d2ac1cc0f3cd86f5e7518a8a7662ebc0459048c08cc7e"}
Apr 24 23:59:50.224593 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:50.224560 2569 generic.go:358] "Generic (PLEG): container finished" podID="a3f2616b-7a4f-49dd-ba8e-43ad9b1e50ca" containerID="4622a390ae0c044a8035a4c6b6af66cb381980057f68e1c78db73ae8a29f815b" exitCode=0
Apr 24 23:59:50.225021 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:50.224644 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88l9dnm" event={"ID":"a3f2616b-7a4f-49dd-ba8e-43ad9b1e50ca","Type":"ContainerDied","Data":"4622a390ae0c044a8035a4c6b6af66cb381980057f68e1c78db73ae8a29f815b"}
Apr 24 23:59:50.226385 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:50.226364 2569 generic.go:358] "Generic (PLEG): container finished" podID="0db4b491-49f0-499f-be74-9f1f25fc5465" containerID="694544bd99ba99625cb45e14ac2905c4cc895dfca2c482aae25a3a02378e09ca" exitCode=0
Apr 24 23:59:50.226490 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:50.226449 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30ftp62" event={"ID":"0db4b491-49f0-499f-be74-9f1f25fc5465","Type":"ContainerDied","Data":"694544bd99ba99625cb45e14ac2905c4cc895dfca2c482aae25a3a02378e09ca"}
Apr 24 23:59:50.228231 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:50.228212 2569 generic.go:358] "Generic (PLEG): container finished"
podID="f27b97cc-170a-465c-9ce2-423113e2a5f1" containerID="b0525b8a30a80609864b7bfe70c5b574527a8d6dce43fd1b5cd1694026a00519" exitCode=0 Apr 24 23:59:50.228331 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:50.228308 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bcw8qs" event={"ID":"f27b97cc-170a-465c-9ce2-423113e2a5f1","Type":"ContainerDied","Data":"b0525b8a30a80609864b7bfe70c5b574527a8d6dce43fd1b5cd1694026a00519"} Apr 24 23:59:50.229986 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:50.229964 2569 generic.go:358] "Generic (PLEG): container finished" podID="cbb3c5d2-19c2-4bad-8107-7a97bc0dc86e" containerID="2823fdd30dc7c8addb945904d2d2ca8f450d60bfeb3bca5d6476e1a22364832c" exitCode=0 Apr 24 23:59:50.230063 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:50.229988 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lbp92" event={"ID":"cbb3c5d2-19c2-4bad-8107-7a97bc0dc86e","Type":"ContainerDied","Data":"2823fdd30dc7c8addb945904d2d2ca8f450d60bfeb3bca5d6476e1a22364832c"} Apr 24 23:59:51.384783 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:51.384763 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bcw8qs" Apr 24 23:59:51.419000 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:51.418978 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30ftp62" Apr 24 23:59:51.422514 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:51.422493 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88l9dnm" Apr 24 23:59:51.426526 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:51.426506 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lbp92" Apr 24 23:59:51.466064 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:51.466040 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a3f2616b-7a4f-49dd-ba8e-43ad9b1e50ca-util\") pod \"a3f2616b-7a4f-49dd-ba8e-43ad9b1e50ca\" (UID: \"a3f2616b-7a4f-49dd-ba8e-43ad9b1e50ca\") " Apr 24 23:59:51.466202 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:51.466081 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qxcng\" (UniqueName: \"kubernetes.io/projected/f27b97cc-170a-465c-9ce2-423113e2a5f1-kube-api-access-qxcng\") pod \"f27b97cc-170a-465c-9ce2-423113e2a5f1\" (UID: \"f27b97cc-170a-465c-9ce2-423113e2a5f1\") " Apr 24 23:59:51.466202 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:51.466107 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0db4b491-49f0-499f-be74-9f1f25fc5465-util\") pod \"0db4b491-49f0-499f-be74-9f1f25fc5465\" (UID: \"0db4b491-49f0-499f-be74-9f1f25fc5465\") " Apr 24 23:59:51.466202 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:51.466126 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8v9t2\" (UniqueName: \"kubernetes.io/projected/a3f2616b-7a4f-49dd-ba8e-43ad9b1e50ca-kube-api-access-8v9t2\") pod \"a3f2616b-7a4f-49dd-ba8e-43ad9b1e50ca\" (UID: \"a3f2616b-7a4f-49dd-ba8e-43ad9b1e50ca\") " Apr 24 23:59:51.466388 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:51.466236 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-p5hkk\" (UniqueName: \"kubernetes.io/projected/0db4b491-49f0-499f-be74-9f1f25fc5465-kube-api-access-p5hkk\") pod \"0db4b491-49f0-499f-be74-9f1f25fc5465\" (UID: \"0db4b491-49f0-499f-be74-9f1f25fc5465\") " Apr 24 23:59:51.466388 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:51.466281 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f27b97cc-170a-465c-9ce2-423113e2a5f1-bundle\") pod \"f27b97cc-170a-465c-9ce2-423113e2a5f1\" (UID: \"f27b97cc-170a-465c-9ce2-423113e2a5f1\") " Apr 24 23:59:51.466388 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:51.466315 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f27b97cc-170a-465c-9ce2-423113e2a5f1-util\") pod \"f27b97cc-170a-465c-9ce2-423113e2a5f1\" (UID: \"f27b97cc-170a-465c-9ce2-423113e2a5f1\") " Apr 24 23:59:51.466582 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:51.466390 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0db4b491-49f0-499f-be74-9f1f25fc5465-bundle\") pod \"0db4b491-49f0-499f-be74-9f1f25fc5465\" (UID: \"0db4b491-49f0-499f-be74-9f1f25fc5465\") " Apr 24 23:59:51.466582 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:51.466449 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a3f2616b-7a4f-49dd-ba8e-43ad9b1e50ca-bundle\") pod \"a3f2616b-7a4f-49dd-ba8e-43ad9b1e50ca\" (UID: \"a3f2616b-7a4f-49dd-ba8e-43ad9b1e50ca\") " Apr 24 23:59:51.467667 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:51.467103 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3f2616b-7a4f-49dd-ba8e-43ad9b1e50ca-bundle" (OuterVolumeSpecName: "bundle") pod "a3f2616b-7a4f-49dd-ba8e-43ad9b1e50ca" (UID: 
"a3f2616b-7a4f-49dd-ba8e-43ad9b1e50ca"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 23:59:51.467667 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:51.467620 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0db4b491-49f0-499f-be74-9f1f25fc5465-bundle" (OuterVolumeSpecName: "bundle") pod "0db4b491-49f0-499f-be74-9f1f25fc5465" (UID: "0db4b491-49f0-499f-be74-9f1f25fc5465"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 23:59:51.468374 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:51.468343 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f27b97cc-170a-465c-9ce2-423113e2a5f1-bundle" (OuterVolumeSpecName: "bundle") pod "f27b97cc-170a-465c-9ce2-423113e2a5f1" (UID: "f27b97cc-170a-465c-9ce2-423113e2a5f1"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 23:59:51.468639 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:51.468608 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3f2616b-7a4f-49dd-ba8e-43ad9b1e50ca-kube-api-access-8v9t2" (OuterVolumeSpecName: "kube-api-access-8v9t2") pod "a3f2616b-7a4f-49dd-ba8e-43ad9b1e50ca" (UID: "a3f2616b-7a4f-49dd-ba8e-43ad9b1e50ca"). InnerVolumeSpecName "kube-api-access-8v9t2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 23:59:51.468748 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:51.468705 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f27b97cc-170a-465c-9ce2-423113e2a5f1-kube-api-access-qxcng" (OuterVolumeSpecName: "kube-api-access-qxcng") pod "f27b97cc-170a-465c-9ce2-423113e2a5f1" (UID: "f27b97cc-170a-465c-9ce2-423113e2a5f1"). InnerVolumeSpecName "kube-api-access-qxcng". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 23:59:51.469647 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:51.469624 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0db4b491-49f0-499f-be74-9f1f25fc5465-kube-api-access-p5hkk" (OuterVolumeSpecName: "kube-api-access-p5hkk") pod "0db4b491-49f0-499f-be74-9f1f25fc5465" (UID: "0db4b491-49f0-499f-be74-9f1f25fc5465"). InnerVolumeSpecName "kube-api-access-p5hkk". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 23:59:51.471758 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:51.471736 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0db4b491-49f0-499f-be74-9f1f25fc5465-util" (OuterVolumeSpecName: "util") pod "0db4b491-49f0-499f-be74-9f1f25fc5465" (UID: "0db4b491-49f0-499f-be74-9f1f25fc5465"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 23:59:51.472996 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:51.472884 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3f2616b-7a4f-49dd-ba8e-43ad9b1e50ca-util" (OuterVolumeSpecName: "util") pod "a3f2616b-7a4f-49dd-ba8e-43ad9b1e50ca" (UID: "a3f2616b-7a4f-49dd-ba8e-43ad9b1e50ca"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 23:59:51.473397 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:51.473358 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f27b97cc-170a-465c-9ce2-423113e2a5f1-util" (OuterVolumeSpecName: "util") pod "f27b97cc-170a-465c-9ce2-423113e2a5f1" (UID: "f27b97cc-170a-465c-9ce2-423113e2a5f1"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 23:59:51.567940 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:51.567898 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qmcpp\" (UniqueName: \"kubernetes.io/projected/cbb3c5d2-19c2-4bad-8107-7a97bc0dc86e-kube-api-access-qmcpp\") pod \"cbb3c5d2-19c2-4bad-8107-7a97bc0dc86e\" (UID: \"cbb3c5d2-19c2-4bad-8107-7a97bc0dc86e\") " Apr 24 23:59:51.568105 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:51.567961 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cbb3c5d2-19c2-4bad-8107-7a97bc0dc86e-util\") pod \"cbb3c5d2-19c2-4bad-8107-7a97bc0dc86e\" (UID: \"cbb3c5d2-19c2-4bad-8107-7a97bc0dc86e\") " Apr 24 23:59:51.568105 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:51.567995 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cbb3c5d2-19c2-4bad-8107-7a97bc0dc86e-bundle\") pod \"cbb3c5d2-19c2-4bad-8107-7a97bc0dc86e\" (UID: \"cbb3c5d2-19c2-4bad-8107-7a97bc0dc86e\") " Apr 24 23:59:51.568105 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:51.568103 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qxcng\" (UniqueName: \"kubernetes.io/projected/f27b97cc-170a-465c-9ce2-423113e2a5f1-kube-api-access-qxcng\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\"" Apr 24 23:59:51.568261 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:51.568113 2569 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0db4b491-49f0-499f-be74-9f1f25fc5465-util\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\"" Apr 24 23:59:51.568261 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:51.568122 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8v9t2\" (UniqueName: 
\"kubernetes.io/projected/a3f2616b-7a4f-49dd-ba8e-43ad9b1e50ca-kube-api-access-8v9t2\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\"" Apr 24 23:59:51.568261 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:51.568131 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-p5hkk\" (UniqueName: \"kubernetes.io/projected/0db4b491-49f0-499f-be74-9f1f25fc5465-kube-api-access-p5hkk\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\"" Apr 24 23:59:51.568261 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:51.568142 2569 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f27b97cc-170a-465c-9ce2-423113e2a5f1-bundle\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\"" Apr 24 23:59:51.568261 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:51.568150 2569 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f27b97cc-170a-465c-9ce2-423113e2a5f1-util\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\"" Apr 24 23:59:51.568261 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:51.568158 2569 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0db4b491-49f0-499f-be74-9f1f25fc5465-bundle\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\"" Apr 24 23:59:51.568261 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:51.568167 2569 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a3f2616b-7a4f-49dd-ba8e-43ad9b1e50ca-bundle\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\"" Apr 24 23:59:51.568261 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:51.568174 2569 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a3f2616b-7a4f-49dd-ba8e-43ad9b1e50ca-util\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\"" Apr 24 23:59:51.568792 ip-10-0-133-214 
kubenswrapper[2569]: I0424 23:59:51.568768 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cbb3c5d2-19c2-4bad-8107-7a97bc0dc86e-bundle" (OuterVolumeSpecName: "bundle") pod "cbb3c5d2-19c2-4bad-8107-7a97bc0dc86e" (UID: "cbb3c5d2-19c2-4bad-8107-7a97bc0dc86e"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 23:59:51.569924 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:51.569904 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbb3c5d2-19c2-4bad-8107-7a97bc0dc86e-kube-api-access-qmcpp" (OuterVolumeSpecName: "kube-api-access-qmcpp") pod "cbb3c5d2-19c2-4bad-8107-7a97bc0dc86e" (UID: "cbb3c5d2-19c2-4bad-8107-7a97bc0dc86e"). InnerVolumeSpecName "kube-api-access-qmcpp". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 23:59:51.573106 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:51.573086 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cbb3c5d2-19c2-4bad-8107-7a97bc0dc86e-util" (OuterVolumeSpecName: "util") pod "cbb3c5d2-19c2-4bad-8107-7a97bc0dc86e" (UID: "cbb3c5d2-19c2-4bad-8107-7a97bc0dc86e"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 23:59:51.669475 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:51.669442 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qmcpp\" (UniqueName: \"kubernetes.io/projected/cbb3c5d2-19c2-4bad-8107-7a97bc0dc86e-kube-api-access-qmcpp\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\"" Apr 24 23:59:51.669475 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:51.669471 2569 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cbb3c5d2-19c2-4bad-8107-7a97bc0dc86e-util\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\"" Apr 24 23:59:51.669475 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:51.669482 2569 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cbb3c5d2-19c2-4bad-8107-7a97bc0dc86e-bundle\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\"" Apr 24 23:59:52.239122 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:52.239078 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lbp92" event={"ID":"cbb3c5d2-19c2-4bad-8107-7a97bc0dc86e","Type":"ContainerDied","Data":"0b1b48b4091e74fc033eba5550f1482b53c1938fa6f8ea1de9610bae25b0c39a"} Apr 24 23:59:52.239122 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:52.239122 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b1b48b4091e74fc033eba5550f1482b53c1938fa6f8ea1de9610bae25b0c39a" Apr 24 23:59:52.239333 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:52.239131 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lbp92" Apr 24 23:59:52.240945 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:52.240870 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88l9dnm" event={"ID":"a3f2616b-7a4f-49dd-ba8e-43ad9b1e50ca","Type":"ContainerDied","Data":"18187e27daedbcdb32e6baf5cdba6e76faa8038374699b881784a171156361e9"} Apr 24 23:59:52.240945 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:52.240905 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="18187e27daedbcdb32e6baf5cdba6e76faa8038374699b881784a171156361e9" Apr 24 23:59:52.240945 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:52.240911 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88l9dnm" Apr 24 23:59:52.242983 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:52.242921 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30ftp62" Apr 24 23:59:52.243121 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:52.242921 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30ftp62" event={"ID":"0db4b491-49f0-499f-be74-9f1f25fc5465","Type":"ContainerDied","Data":"cd84915c31d88c81fea91a847419996c837cd55882c835b1b88c8a98aa6c019c"} Apr 24 23:59:52.243180 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:52.243166 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cd84915c31d88c81fea91a847419996c837cd55882c835b1b88c8a98aa6c019c" Apr 24 23:59:52.244746 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:52.244719 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bcw8qs" event={"ID":"f27b97cc-170a-465c-9ce2-423113e2a5f1","Type":"ContainerDied","Data":"6d4717e266946aabccf3043557e117cd63a9af6d0d48490f65bc7a948f5d21a9"} Apr 24 23:59:52.244854 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:52.244745 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d4717e266946aabccf3043557e117cd63a9af6d0d48490f65bc7a948f5d21a9" Apr 24 23:59:52.244854 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:52.244776 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bcw8qs" Apr 24 23:59:57.483952 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:57.483921 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-operator-7587b89b76-mzz8b"] Apr 24 23:59:57.484328 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:57.484245 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a3f2616b-7a4f-49dd-ba8e-43ad9b1e50ca" containerName="extract" Apr 24 23:59:57.484328 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:57.484256 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3f2616b-7a4f-49dd-ba8e-43ad9b1e50ca" containerName="extract" Apr 24 23:59:57.484328 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:57.484265 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a3f2616b-7a4f-49dd-ba8e-43ad9b1e50ca" containerName="util" Apr 24 23:59:57.484328 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:57.484270 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3f2616b-7a4f-49dd-ba8e-43ad9b1e50ca" containerName="util" Apr 24 23:59:57.484328 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:57.484278 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f27b97cc-170a-465c-9ce2-423113e2a5f1" containerName="extract" Apr 24 23:59:57.484328 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:57.484285 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="f27b97cc-170a-465c-9ce2-423113e2a5f1" containerName="extract" Apr 24 23:59:57.484328 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:57.484292 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cbb3c5d2-19c2-4bad-8107-7a97bc0dc86e" containerName="util" Apr 24 23:59:57.484328 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:57.484302 2569 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="cbb3c5d2-19c2-4bad-8107-7a97bc0dc86e" containerName="util" Apr 24 23:59:57.484328 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:57.484309 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cbb3c5d2-19c2-4bad-8107-7a97bc0dc86e" containerName="extract" Apr 24 23:59:57.484328 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:57.484314 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbb3c5d2-19c2-4bad-8107-7a97bc0dc86e" containerName="extract" Apr 24 23:59:57.484328 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:57.484322 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cbb3c5d2-19c2-4bad-8107-7a97bc0dc86e" containerName="pull" Apr 24 23:59:57.484328 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:57.484327 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbb3c5d2-19c2-4bad-8107-7a97bc0dc86e" containerName="pull" Apr 24 23:59:57.484328 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:57.484332 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0db4b491-49f0-499f-be74-9f1f25fc5465" containerName="extract" Apr 24 23:59:57.484792 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:57.484337 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="0db4b491-49f0-499f-be74-9f1f25fc5465" containerName="extract" Apr 24 23:59:57.484792 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:57.484344 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f27b97cc-170a-465c-9ce2-423113e2a5f1" containerName="pull" Apr 24 23:59:57.484792 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:57.484348 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="f27b97cc-170a-465c-9ce2-423113e2a5f1" containerName="pull" Apr 24 23:59:57.484792 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:57.484360 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0db4b491-49f0-499f-be74-9f1f25fc5465" 
containerName="util" Apr 24 23:59:57.484792 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:57.484365 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="0db4b491-49f0-499f-be74-9f1f25fc5465" containerName="util" Apr 24 23:59:57.484792 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:57.484369 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0db4b491-49f0-499f-be74-9f1f25fc5465" containerName="pull" Apr 24 23:59:57.484792 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:57.484374 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="0db4b491-49f0-499f-be74-9f1f25fc5465" containerName="pull" Apr 24 23:59:57.484792 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:57.484380 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f27b97cc-170a-465c-9ce2-423113e2a5f1" containerName="util" Apr 24 23:59:57.484792 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:57.484384 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="f27b97cc-170a-465c-9ce2-423113e2a5f1" containerName="util" Apr 24 23:59:57.484792 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:57.484390 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a3f2616b-7a4f-49dd-ba8e-43ad9b1e50ca" containerName="pull" Apr 24 23:59:57.484792 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:57.484395 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3f2616b-7a4f-49dd-ba8e-43ad9b1e50ca" containerName="pull" Apr 24 23:59:57.484792 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:57.484455 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="f27b97cc-170a-465c-9ce2-423113e2a5f1" containerName="extract" Apr 24 23:59:57.484792 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:57.484463 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="a3f2616b-7a4f-49dd-ba8e-43ad9b1e50ca" containerName="extract" Apr 24 23:59:57.484792 ip-10-0-133-214 kubenswrapper[2569]: I0424 
23:59:57.484470 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="0db4b491-49f0-499f-be74-9f1f25fc5465" containerName="extract" Apr 24 23:59:57.484792 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:57.484475 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="cbb3c5d2-19c2-4bad-8107-7a97bc0dc86e" containerName="extract" Apr 24 23:59:57.488709 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:57.488693 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-7587b89b76-mzz8b" Apr 24 23:59:57.491028 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:57.491007 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 24 23:59:57.491156 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:57.491135 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-operator-dockercfg-5sgkc\"" Apr 24 23:59:57.491222 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:57.491201 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 24 23:59:57.502525 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:57.502496 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-7587b89b76-mzz8b"] Apr 24 23:59:57.616286 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:57.616250 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsjrh\" (UniqueName: \"kubernetes.io/projected/ee6bf646-424e-4677-afbe-0c00df0548c7-kube-api-access-hsjrh\") pod \"authorino-operator-7587b89b76-mzz8b\" (UID: \"ee6bf646-424e-4677-afbe-0c00df0548c7\") " pod="kuadrant-system/authorino-operator-7587b89b76-mzz8b" Apr 24 23:59:57.717046 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:57.717012 2569 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-hsjrh\" (UniqueName: \"kubernetes.io/projected/ee6bf646-424e-4677-afbe-0c00df0548c7-kube-api-access-hsjrh\") pod \"authorino-operator-7587b89b76-mzz8b\" (UID: \"ee6bf646-424e-4677-afbe-0c00df0548c7\") " pod="kuadrant-system/authorino-operator-7587b89b76-mzz8b" Apr 24 23:59:57.725275 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:57.725252 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hsjrh\" (UniqueName: \"kubernetes.io/projected/ee6bf646-424e-4677-afbe-0c00df0548c7-kube-api-access-hsjrh\") pod \"authorino-operator-7587b89b76-mzz8b\" (UID: \"ee6bf646-424e-4677-afbe-0c00df0548c7\") " pod="kuadrant-system/authorino-operator-7587b89b76-mzz8b" Apr 24 23:59:57.799339 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:57.799309 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-7587b89b76-mzz8b" Apr 24 23:59:57.923526 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:57.923418 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-7587b89b76-mzz8b"] Apr 24 23:59:57.925874 ip-10-0-133-214 kubenswrapper[2569]: W0424 23:59:57.925845 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee6bf646_424e_4677_afbe_0c00df0548c7.slice/crio-5639995c1015a84217b5d8e3f361fa01a31bd47e0b1e44ef475cd5684e59971e WatchSource:0}: Error finding container 5639995c1015a84217b5d8e3f361fa01a31bd47e0b1e44ef475cd5684e59971e: Status 404 returned error can't find the container with id 5639995c1015a84217b5d8e3f361fa01a31bd47e0b1e44ef475cd5684e59971e Apr 24 23:59:58.266754 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:58.266723 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-7587b89b76-mzz8b" 
event={"ID":"ee6bf646-424e-4677-afbe-0c00df0548c7","Type":"ContainerStarted","Data":"5639995c1015a84217b5d8e3f361fa01a31bd47e0b1e44ef475cd5684e59971e"} Apr 24 23:59:59.698070 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:59.698037 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/dns-operator-controller-manager-844548ff4c-rkxq5"] Apr 24 23:59:59.706830 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:59.706805 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-rkxq5" Apr 24 23:59:59.709219 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:59.709174 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-env\"" Apr 24 23:59:59.709365 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:59.709181 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-manager-dockercfg-9pkfr\"" Apr 24 23:59:59.711045 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:59.711013 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-844548ff4c-rkxq5"] Apr 24 23:59:59.835704 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:59.835665 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxmpj\" (UniqueName: \"kubernetes.io/projected/7c3b8652-c669-4e33-ae9c-b60cfae028c4-kube-api-access-bxmpj\") pod \"dns-operator-controller-manager-844548ff4c-rkxq5\" (UID: \"7c3b8652-c669-4e33-ae9c-b60cfae028c4\") " pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-rkxq5" Apr 24 23:59:59.936591 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:59.936551 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bxmpj\" (UniqueName: 
\"kubernetes.io/projected/7c3b8652-c669-4e33-ae9c-b60cfae028c4-kube-api-access-bxmpj\") pod \"dns-operator-controller-manager-844548ff4c-rkxq5\" (UID: \"7c3b8652-c669-4e33-ae9c-b60cfae028c4\") " pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-rkxq5" Apr 24 23:59:59.947585 ip-10-0-133-214 kubenswrapper[2569]: I0424 23:59:59.947556 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxmpj\" (UniqueName: \"kubernetes.io/projected/7c3b8652-c669-4e33-ae9c-b60cfae028c4-kube-api-access-bxmpj\") pod \"dns-operator-controller-manager-844548ff4c-rkxq5\" (UID: \"7c3b8652-c669-4e33-ae9c-b60cfae028c4\") " pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-rkxq5" Apr 25 00:00:00.021085 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:00.021041 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-rkxq5" Apr 25 00:00:00.155667 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:00.155628 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-pruner-29617920-jhs4w"] Apr 25 00:00:00.159271 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:00.159242 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-pruner-29617920-jhs4w" Apr 25 00:00:00.161764 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:00.161743 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"serviceca\"" Apr 25 00:00:00.161885 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:00.161789 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"pruner-dockercfg-zz7t9\"" Apr 25 00:00:00.169420 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:00.169378 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-pruner-29617920-jhs4w"] Apr 25 00:00:00.238526 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:00.238496 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/86623852-f437-4e94-8774-6652dabed4fb-serviceca\") pod \"image-pruner-29617920-jhs4w\" (UID: \"86623852-f437-4e94-8774-6652dabed4fb\") " pod="openshift-image-registry/image-pruner-29617920-jhs4w" Apr 25 00:00:00.238709 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:00.238604 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmgct\" (UniqueName: \"kubernetes.io/projected/86623852-f437-4e94-8774-6652dabed4fb-kube-api-access-xmgct\") pod \"image-pruner-29617920-jhs4w\" (UID: \"86623852-f437-4e94-8774-6652dabed4fb\") " pod="openshift-image-registry/image-pruner-29617920-jhs4w" Apr 25 00:00:00.339359 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:00.339246 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xmgct\" (UniqueName: \"kubernetes.io/projected/86623852-f437-4e94-8774-6652dabed4fb-kube-api-access-xmgct\") pod \"image-pruner-29617920-jhs4w\" (UID: \"86623852-f437-4e94-8774-6652dabed4fb\") " 
pod="openshift-image-registry/image-pruner-29617920-jhs4w" Apr 25 00:00:00.339359 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:00.339327 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/86623852-f437-4e94-8774-6652dabed4fb-serviceca\") pod \"image-pruner-29617920-jhs4w\" (UID: \"86623852-f437-4e94-8774-6652dabed4fb\") " pod="openshift-image-registry/image-pruner-29617920-jhs4w" Apr 25 00:00:00.340093 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:00.340067 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/86623852-f437-4e94-8774-6652dabed4fb-serviceca\") pod \"image-pruner-29617920-jhs4w\" (UID: \"86623852-f437-4e94-8774-6652dabed4fb\") " pod="openshift-image-registry/image-pruner-29617920-jhs4w" Apr 25 00:00:00.347927 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:00.347875 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmgct\" (UniqueName: \"kubernetes.io/projected/86623852-f437-4e94-8774-6652dabed4fb-kube-api-access-xmgct\") pod \"image-pruner-29617920-jhs4w\" (UID: \"86623852-f437-4e94-8774-6652dabed4fb\") " pod="openshift-image-registry/image-pruner-29617920-jhs4w" Apr 25 00:00:00.471009 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:00.470981 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-pruner-29617920-jhs4w" Apr 25 00:00:00.505204 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:00.505177 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-844548ff4c-rkxq5"] Apr 25 00:00:00.507146 ip-10-0-133-214 kubenswrapper[2569]: W0425 00:00:00.507119 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c3b8652_c669_4e33_ae9c_b60cfae028c4.slice/crio-6e3108d7e8cb10766d4e00ac034da0dd949e23bd8475fd5bd64c142c4b71ce1f WatchSource:0}: Error finding container 6e3108d7e8cb10766d4e00ac034da0dd949e23bd8475fd5bd64c142c4b71ce1f: Status 404 returned error can't find the container with id 6e3108d7e8cb10766d4e00ac034da0dd949e23bd8475fd5bd64c142c4b71ce1f Apr 25 00:00:00.607150 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:00.607124 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-pruner-29617920-jhs4w"] Apr 25 00:00:00.608594 ip-10-0-133-214 kubenswrapper[2569]: W0425 00:00:00.608567 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86623852_f437_4e94_8774_6652dabed4fb.slice/crio-9510df13b19c5f6d3412c79cc8c2442d75487c97ffcd887d54e716c3dec34882 WatchSource:0}: Error finding container 9510df13b19c5f6d3412c79cc8c2442d75487c97ffcd887d54e716c3dec34882: Status 404 returned error can't find the container with id 9510df13b19c5f6d3412c79cc8c2442d75487c97ffcd887d54e716c3dec34882 Apr 25 00:00:01.280369 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:01.280330 2569 generic.go:358] "Generic (PLEG): container finished" podID="86623852-f437-4e94-8774-6652dabed4fb" containerID="72a12d4b51ccc342c65ff6bc69eedc6e350e21e0239cfa3c49e042d207331d70" exitCode=0 Apr 25 00:00:01.280940 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:01.280430 2569 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-image-registry/image-pruner-29617920-jhs4w" event={"ID":"86623852-f437-4e94-8774-6652dabed4fb","Type":"ContainerDied","Data":"72a12d4b51ccc342c65ff6bc69eedc6e350e21e0239cfa3c49e042d207331d70"} Apr 25 00:00:01.280940 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:01.280473 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29617920-jhs4w" event={"ID":"86623852-f437-4e94-8774-6652dabed4fb","Type":"ContainerStarted","Data":"9510df13b19c5f6d3412c79cc8c2442d75487c97ffcd887d54e716c3dec34882"} Apr 25 00:00:01.282207 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:01.282185 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-7587b89b76-mzz8b" event={"ID":"ee6bf646-424e-4677-afbe-0c00df0548c7","Type":"ContainerStarted","Data":"f144b5900cee10f19c74c8636890d09f96750ad8eee9d947db80e4e0c47ec373"} Apr 25 00:00:01.282346 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:01.282309 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/authorino-operator-7587b89b76-mzz8b" Apr 25 00:00:01.283513 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:01.283490 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-rkxq5" event={"ID":"7c3b8652-c669-4e33-ae9c-b60cfae028c4","Type":"ContainerStarted","Data":"6e3108d7e8cb10766d4e00ac034da0dd949e23bd8475fd5bd64c142c4b71ce1f"} Apr 25 00:00:01.326002 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:01.325914 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-operator-7587b89b76-mzz8b" podStartSLOduration=1.817376316 podStartE2EDuration="4.325901036s" podCreationTimestamp="2026-04-24 23:59:57 +0000 UTC" firstStartedPulling="2026-04-24 23:59:57.928265101 +0000 UTC m=+376.532991676" lastFinishedPulling="2026-04-25 00:00:00.436789818 +0000 UTC m=+379.041516396" 
observedRunningTime="2026-04-25 00:00:01.322998746 +0000 UTC m=+379.927725343" watchObservedRunningTime="2026-04-25 00:00:01.325901036 +0000 UTC m=+379.930627633" Apr 25 00:00:02.243198 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:02.243162 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-645kp"] Apr 25 00:00:02.246078 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:02.246057 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-645kp" Apr 25 00:00:02.248439 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:02.248398 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"limitador-operator-controller-manager-dockercfg-bk84z\"" Apr 25 00:00:02.258026 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:02.258002 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-645kp"] Apr 25 00:00:02.355011 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:02.354974 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbtlm\" (UniqueName: \"kubernetes.io/projected/2d813bc9-0850-4c30-9d76-eef15935a388-kube-api-access-dbtlm\") pod \"limitador-operator-controller-manager-c7fb4c8d5-645kp\" (UID: \"2d813bc9-0850-4c30-9d76-eef15935a388\") " pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-645kp" Apr 25 00:00:02.413660 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:02.413635 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-pruner-29617920-jhs4w" Apr 25 00:00:02.455791 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:02.455766 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dbtlm\" (UniqueName: \"kubernetes.io/projected/2d813bc9-0850-4c30-9d76-eef15935a388-kube-api-access-dbtlm\") pod \"limitador-operator-controller-manager-c7fb4c8d5-645kp\" (UID: \"2d813bc9-0850-4c30-9d76-eef15935a388\") " pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-645kp" Apr 25 00:00:02.466488 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:02.466461 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbtlm\" (UniqueName: \"kubernetes.io/projected/2d813bc9-0850-4c30-9d76-eef15935a388-kube-api-access-dbtlm\") pod \"limitador-operator-controller-manager-c7fb4c8d5-645kp\" (UID: \"2d813bc9-0850-4c30-9d76-eef15935a388\") " pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-645kp" Apr 25 00:00:02.556537 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:02.556511 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-645kp" Apr 25 00:00:02.556663 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:02.556605 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/86623852-f437-4e94-8774-6652dabed4fb-serviceca\") pod \"86623852-f437-4e94-8774-6652dabed4fb\" (UID: \"86623852-f437-4e94-8774-6652dabed4fb\") " Apr 25 00:00:02.556663 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:02.556646 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xmgct\" (UniqueName: \"kubernetes.io/projected/86623852-f437-4e94-8774-6652dabed4fb-kube-api-access-xmgct\") pod \"86623852-f437-4e94-8774-6652dabed4fb\" (UID: \"86623852-f437-4e94-8774-6652dabed4fb\") " Apr 25 00:00:02.556979 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:02.556950 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86623852-f437-4e94-8774-6652dabed4fb-serviceca" (OuterVolumeSpecName: "serviceca") pod "86623852-f437-4e94-8774-6652dabed4fb" (UID: "86623852-f437-4e94-8774-6652dabed4fb"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 25 00:00:02.558642 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:02.558618 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86623852-f437-4e94-8774-6652dabed4fb-kube-api-access-xmgct" (OuterVolumeSpecName: "kube-api-access-xmgct") pod "86623852-f437-4e94-8774-6652dabed4fb" (UID: "86623852-f437-4e94-8774-6652dabed4fb"). InnerVolumeSpecName "kube-api-access-xmgct". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 25 00:00:02.657872 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:02.657844 2569 reconciler_common.go:299] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/86623852-f437-4e94-8774-6652dabed4fb-serviceca\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\"" Apr 25 00:00:02.657872 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:02.657871 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xmgct\" (UniqueName: \"kubernetes.io/projected/86623852-f437-4e94-8774-6652dabed4fb-kube-api-access-xmgct\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\"" Apr 25 00:00:02.685542 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:02.685517 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-645kp"] Apr 25 00:00:02.687731 ip-10-0-133-214 kubenswrapper[2569]: W0425 00:00:02.687694 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d813bc9_0850_4c30_9d76_eef15935a388.slice/crio-ba9cf7c8ab3e264ba0133ca8c76c7c1bbba1c39fa1de2642f7986e4537cda819 WatchSource:0}: Error finding container ba9cf7c8ab3e264ba0133ca8c76c7c1bbba1c39fa1de2642f7986e4537cda819: Status 404 returned error can't find the container with id ba9cf7c8ab3e264ba0133ca8c76c7c1bbba1c39fa1de2642f7986e4537cda819 Apr 25 00:00:03.292070 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:03.292032 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-645kp" event={"ID":"2d813bc9-0850-4c30-9d76-eef15935a388","Type":"ContainerStarted","Data":"ba9cf7c8ab3e264ba0133ca8c76c7c1bbba1c39fa1de2642f7986e4537cda819"} Apr 25 00:00:03.293176 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:03.293147 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-image-registry/image-pruner-29617920-jhs4w" event={"ID":"86623852-f437-4e94-8774-6652dabed4fb","Type":"ContainerDied","Data":"9510df13b19c5f6d3412c79cc8c2442d75487c97ffcd887d54e716c3dec34882"} Apr 25 00:00:03.293176 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:03.293165 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-pruner-29617920-jhs4w" Apr 25 00:00:03.293307 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:03.293173 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9510df13b19c5f6d3412c79cc8c2442d75487c97ffcd887d54e716c3dec34882" Apr 25 00:00:10.318101 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:10.318064 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-645kp" event={"ID":"2d813bc9-0850-4c30-9d76-eef15935a388","Type":"ContainerStarted","Data":"091d57b4dec9abad4fe4fe95f6d556f7eabd978fc1ed372d18717d26277144ea"} Apr 25 00:00:10.318504 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:10.318113 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-645kp" Apr 25 00:00:10.336710 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:10.336655 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-645kp" podStartSLOduration=1.3413095529999999 podStartE2EDuration="8.336641717s" podCreationTimestamp="2026-04-25 00:00:02 +0000 UTC" firstStartedPulling="2026-04-25 00:00:02.689625424 +0000 UTC m=+381.294351999" lastFinishedPulling="2026-04-25 00:00:09.684957584 +0000 UTC m=+388.289684163" observedRunningTime="2026-04-25 00:00:10.334875427 +0000 UTC m=+388.939602025" watchObservedRunningTime="2026-04-25 00:00:10.336641717 +0000 UTC m=+388.941368315" Apr 25 00:00:12.289626 ip-10-0-133-214 
kubenswrapper[2569]: I0425 00:00:12.289595 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/authorino-operator-7587b89b76-mzz8b" Apr 25 00:00:18.694472 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:18.694435 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6f77b8c9c8-vch7j"] Apr 25 00:00:18.694871 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:18.694856 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="86623852-f437-4e94-8774-6652dabed4fb" containerName="image-pruner" Apr 25 00:00:18.694914 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:18.694874 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="86623852-f437-4e94-8774-6652dabed4fb" containerName="image-pruner" Apr 25 00:00:18.694981 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:18.694970 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="86623852-f437-4e94-8774-6652dabed4fb" containerName="image-pruner" Apr 25 00:00:18.698374 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:18.698352 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6f77b8c9c8-vch7j" Apr 25 00:00:18.712389 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:18.712363 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6f77b8c9c8-vch7j"] Apr 25 00:00:18.789535 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:18.789496 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/31d18e68-4a12-4c04-aeee-75cd50a3dd79-service-ca\") pod \"console-6f77b8c9c8-vch7j\" (UID: \"31d18e68-4a12-4c04-aeee-75cd50a3dd79\") " pod="openshift-console/console-6f77b8c9c8-vch7j" Apr 25 00:00:18.789712 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:18.789548 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/31d18e68-4a12-4c04-aeee-75cd50a3dd79-trusted-ca-bundle\") pod \"console-6f77b8c9c8-vch7j\" (UID: \"31d18e68-4a12-4c04-aeee-75cd50a3dd79\") " pod="openshift-console/console-6f77b8c9c8-vch7j" Apr 25 00:00:18.789712 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:18.789574 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4g7jj\" (UniqueName: \"kubernetes.io/projected/31d18e68-4a12-4c04-aeee-75cd50a3dd79-kube-api-access-4g7jj\") pod \"console-6f77b8c9c8-vch7j\" (UID: \"31d18e68-4a12-4c04-aeee-75cd50a3dd79\") " pod="openshift-console/console-6f77b8c9c8-vch7j" Apr 25 00:00:18.789712 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:18.789598 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/31d18e68-4a12-4c04-aeee-75cd50a3dd79-console-config\") pod \"console-6f77b8c9c8-vch7j\" (UID: \"31d18e68-4a12-4c04-aeee-75cd50a3dd79\") " pod="openshift-console/console-6f77b8c9c8-vch7j" Apr 25 
00:00:18.789712 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:18.789681 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/31d18e68-4a12-4c04-aeee-75cd50a3dd79-oauth-serving-cert\") pod \"console-6f77b8c9c8-vch7j\" (UID: \"31d18e68-4a12-4c04-aeee-75cd50a3dd79\") " pod="openshift-console/console-6f77b8c9c8-vch7j" Apr 25 00:00:18.789928 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:18.789739 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/31d18e68-4a12-4c04-aeee-75cd50a3dd79-console-serving-cert\") pod \"console-6f77b8c9c8-vch7j\" (UID: \"31d18e68-4a12-4c04-aeee-75cd50a3dd79\") " pod="openshift-console/console-6f77b8c9c8-vch7j" Apr 25 00:00:18.789928 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:18.789762 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/31d18e68-4a12-4c04-aeee-75cd50a3dd79-console-oauth-config\") pod \"console-6f77b8c9c8-vch7j\" (UID: \"31d18e68-4a12-4c04-aeee-75cd50a3dd79\") " pod="openshift-console/console-6f77b8c9c8-vch7j" Apr 25 00:00:18.891054 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:18.890972 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/31d18e68-4a12-4c04-aeee-75cd50a3dd79-service-ca\") pod \"console-6f77b8c9c8-vch7j\" (UID: \"31d18e68-4a12-4c04-aeee-75cd50a3dd79\") " pod="openshift-console/console-6f77b8c9c8-vch7j" Apr 25 00:00:18.891054 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:18.891025 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/31d18e68-4a12-4c04-aeee-75cd50a3dd79-trusted-ca-bundle\") pod 
\"console-6f77b8c9c8-vch7j\" (UID: \"31d18e68-4a12-4c04-aeee-75cd50a3dd79\") " pod="openshift-console/console-6f77b8c9c8-vch7j" Apr 25 00:00:18.891054 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:18.891052 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4g7jj\" (UniqueName: \"kubernetes.io/projected/31d18e68-4a12-4c04-aeee-75cd50a3dd79-kube-api-access-4g7jj\") pod \"console-6f77b8c9c8-vch7j\" (UID: \"31d18e68-4a12-4c04-aeee-75cd50a3dd79\") " pod="openshift-console/console-6f77b8c9c8-vch7j" Apr 25 00:00:18.891476 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:18.891084 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/31d18e68-4a12-4c04-aeee-75cd50a3dd79-console-config\") pod \"console-6f77b8c9c8-vch7j\" (UID: \"31d18e68-4a12-4c04-aeee-75cd50a3dd79\") " pod="openshift-console/console-6f77b8c9c8-vch7j" Apr 25 00:00:18.891476 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:18.891109 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/31d18e68-4a12-4c04-aeee-75cd50a3dd79-oauth-serving-cert\") pod \"console-6f77b8c9c8-vch7j\" (UID: \"31d18e68-4a12-4c04-aeee-75cd50a3dd79\") " pod="openshift-console/console-6f77b8c9c8-vch7j" Apr 25 00:00:18.891476 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:18.891162 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/31d18e68-4a12-4c04-aeee-75cd50a3dd79-console-serving-cert\") pod \"console-6f77b8c9c8-vch7j\" (UID: \"31d18e68-4a12-4c04-aeee-75cd50a3dd79\") " pod="openshift-console/console-6f77b8c9c8-vch7j" Apr 25 00:00:18.891476 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:18.891316 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/31d18e68-4a12-4c04-aeee-75cd50a3dd79-console-oauth-config\") pod \"console-6f77b8c9c8-vch7j\" (UID: \"31d18e68-4a12-4c04-aeee-75cd50a3dd79\") " pod="openshift-console/console-6f77b8c9c8-vch7j" Apr 25 00:00:18.891896 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:18.891862 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/31d18e68-4a12-4c04-aeee-75cd50a3dd79-service-ca\") pod \"console-6f77b8c9c8-vch7j\" (UID: \"31d18e68-4a12-4c04-aeee-75cd50a3dd79\") " pod="openshift-console/console-6f77b8c9c8-vch7j" Apr 25 00:00:18.892007 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:18.891862 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/31d18e68-4a12-4c04-aeee-75cd50a3dd79-console-config\") pod \"console-6f77b8c9c8-vch7j\" (UID: \"31d18e68-4a12-4c04-aeee-75cd50a3dd79\") " pod="openshift-console/console-6f77b8c9c8-vch7j" Apr 25 00:00:18.892007 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:18.891915 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/31d18e68-4a12-4c04-aeee-75cd50a3dd79-trusted-ca-bundle\") pod \"console-6f77b8c9c8-vch7j\" (UID: \"31d18e68-4a12-4c04-aeee-75cd50a3dd79\") " pod="openshift-console/console-6f77b8c9c8-vch7j" Apr 25 00:00:18.892007 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:18.891982 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/31d18e68-4a12-4c04-aeee-75cd50a3dd79-oauth-serving-cert\") pod \"console-6f77b8c9c8-vch7j\" (UID: \"31d18e68-4a12-4c04-aeee-75cd50a3dd79\") " pod="openshift-console/console-6f77b8c9c8-vch7j" Apr 25 00:00:18.894102 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:18.894076 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/31d18e68-4a12-4c04-aeee-75cd50a3dd79-console-oauth-config\") pod \"console-6f77b8c9c8-vch7j\" (UID: \"31d18e68-4a12-4c04-aeee-75cd50a3dd79\") " pod="openshift-console/console-6f77b8c9c8-vch7j" Apr 25 00:00:18.894181 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:18.894155 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/31d18e68-4a12-4c04-aeee-75cd50a3dd79-console-serving-cert\") pod \"console-6f77b8c9c8-vch7j\" (UID: \"31d18e68-4a12-4c04-aeee-75cd50a3dd79\") " pod="openshift-console/console-6f77b8c9c8-vch7j" Apr 25 00:00:18.898990 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:18.898969 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4g7jj\" (UniqueName: \"kubernetes.io/projected/31d18e68-4a12-4c04-aeee-75cd50a3dd79-kube-api-access-4g7jj\") pod \"console-6f77b8c9c8-vch7j\" (UID: \"31d18e68-4a12-4c04-aeee-75cd50a3dd79\") " pod="openshift-console/console-6f77b8c9c8-vch7j" Apr 25 00:00:19.008028 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:19.007997 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6f77b8c9c8-vch7j" Apr 25 00:00:19.128826 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:19.128803 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6f77b8c9c8-vch7j"] Apr 25 00:00:19.130824 ip-10-0-133-214 kubenswrapper[2569]: W0425 00:00:19.130795 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod31d18e68_4a12_4c04_aeee_75cd50a3dd79.slice/crio-26c267235e6c3d59f254f86fde4144cf35b39c19084a2c3be194e51b6660335a WatchSource:0}: Error finding container 26c267235e6c3d59f254f86fde4144cf35b39c19084a2c3be194e51b6660335a: Status 404 returned error can't find the container with id 26c267235e6c3d59f254f86fde4144cf35b39c19084a2c3be194e51b6660335a Apr 25 00:00:19.350145 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:19.350109 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6f77b8c9c8-vch7j" event={"ID":"31d18e68-4a12-4c04-aeee-75cd50a3dd79","Type":"ContainerStarted","Data":"7e6119f4cd302e3ff64bfe7116e93f19c79d13b37415378418c7533a4d1f43aa"} Apr 25 00:00:19.350145 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:19.350150 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6f77b8c9c8-vch7j" event={"ID":"31d18e68-4a12-4c04-aeee-75cd50a3dd79","Type":"ContainerStarted","Data":"26c267235e6c3d59f254f86fde4144cf35b39c19084a2c3be194e51b6660335a"} Apr 25 00:00:19.368768 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:19.368721 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6f77b8c9c8-vch7j" podStartSLOduration=1.368707799 podStartE2EDuration="1.368707799s" podCreationTimestamp="2026-04-25 00:00:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-25 00:00:19.368445919 +0000 UTC 
m=+397.973172518" watchObservedRunningTime="2026-04-25 00:00:19.368707799 +0000 UTC m=+397.973434396" Apr 25 00:00:21.325011 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:21.324977 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-645kp" Apr 25 00:00:29.009038 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:29.009002 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6f77b8c9c8-vch7j" Apr 25 00:00:29.009038 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:29.009042 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6f77b8c9c8-vch7j" Apr 25 00:00:29.013951 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:29.013923 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6f77b8c9c8-vch7j" Apr 25 00:00:29.389373 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:29.389293 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6f77b8c9c8-vch7j" Apr 25 00:00:29.454731 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:29.454696 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-786d889dfb-86rvz"] Apr 25 00:00:54.473185 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:54.473145 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-786d889dfb-86rvz" podUID="6a33cb9a-3c1d-4e67-b523-b81a49b1de4e" containerName="console" containerID="cri-o://1d1c1d0c3e5e40a25208161ca305660e57c4eeb83d907c3bb50d0bde6a5ff30b" gracePeriod=15 Apr 25 00:00:54.717299 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:54.717267 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-786d889dfb-86rvz_6a33cb9a-3c1d-4e67-b523-b81a49b1de4e/console/0.log" Apr 25 00:00:54.717448 
ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:54.717341 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-786d889dfb-86rvz" Apr 25 00:00:54.768613 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:54.768587 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4g92c\" (UniqueName: \"kubernetes.io/projected/6a33cb9a-3c1d-4e67-b523-b81a49b1de4e-kube-api-access-4g92c\") pod \"6a33cb9a-3c1d-4e67-b523-b81a49b1de4e\" (UID: \"6a33cb9a-3c1d-4e67-b523-b81a49b1de4e\") " Apr 25 00:00:54.768755 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:54.768622 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6a33cb9a-3c1d-4e67-b523-b81a49b1de4e-console-serving-cert\") pod \"6a33cb9a-3c1d-4e67-b523-b81a49b1de4e\" (UID: \"6a33cb9a-3c1d-4e67-b523-b81a49b1de4e\") " Apr 25 00:00:54.768755 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:54.768661 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6a33cb9a-3c1d-4e67-b523-b81a49b1de4e-service-ca\") pod \"6a33cb9a-3c1d-4e67-b523-b81a49b1de4e\" (UID: \"6a33cb9a-3c1d-4e67-b523-b81a49b1de4e\") " Apr 25 00:00:54.768755 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:54.768689 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6a33cb9a-3c1d-4e67-b523-b81a49b1de4e-trusted-ca-bundle\") pod \"6a33cb9a-3c1d-4e67-b523-b81a49b1de4e\" (UID: \"6a33cb9a-3c1d-4e67-b523-b81a49b1de4e\") " Apr 25 00:00:54.768871 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:54.768814 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6a33cb9a-3c1d-4e67-b523-b81a49b1de4e-oauth-serving-cert\") pod 
\"6a33cb9a-3c1d-4e67-b523-b81a49b1de4e\" (UID: \"6a33cb9a-3c1d-4e67-b523-b81a49b1de4e\") " Apr 25 00:00:54.768871 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:54.768860 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6a33cb9a-3c1d-4e67-b523-b81a49b1de4e-console-config\") pod \"6a33cb9a-3c1d-4e67-b523-b81a49b1de4e\" (UID: \"6a33cb9a-3c1d-4e67-b523-b81a49b1de4e\") " Apr 25 00:00:54.768957 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:54.768902 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6a33cb9a-3c1d-4e67-b523-b81a49b1de4e-console-oauth-config\") pod \"6a33cb9a-3c1d-4e67-b523-b81a49b1de4e\" (UID: \"6a33cb9a-3c1d-4e67-b523-b81a49b1de4e\") " Apr 25 00:00:54.769093 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:54.769070 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a33cb9a-3c1d-4e67-b523-b81a49b1de4e-service-ca" (OuterVolumeSpecName: "service-ca") pod "6a33cb9a-3c1d-4e67-b523-b81a49b1de4e" (UID: "6a33cb9a-3c1d-4e67-b523-b81a49b1de4e"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 25 00:00:54.769220 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:54.769194 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a33cb9a-3c1d-4e67-b523-b81a49b1de4e-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6a33cb9a-3c1d-4e67-b523-b81a49b1de4e" (UID: "6a33cb9a-3c1d-4e67-b523-b81a49b1de4e"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 25 00:00:54.769328 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:54.769239 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a33cb9a-3c1d-4e67-b523-b81a49b1de4e-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "6a33cb9a-3c1d-4e67-b523-b81a49b1de4e" (UID: "6a33cb9a-3c1d-4e67-b523-b81a49b1de4e"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 25 00:00:54.769381 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:54.769318 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a33cb9a-3c1d-4e67-b523-b81a49b1de4e-console-config" (OuterVolumeSpecName: "console-config") pod "6a33cb9a-3c1d-4e67-b523-b81a49b1de4e" (UID: "6a33cb9a-3c1d-4e67-b523-b81a49b1de4e"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 25 00:00:54.770840 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:54.770818 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a33cb9a-3c1d-4e67-b523-b81a49b1de4e-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "6a33cb9a-3c1d-4e67-b523-b81a49b1de4e" (UID: "6a33cb9a-3c1d-4e67-b523-b81a49b1de4e"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 25 00:00:54.770929 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:54.770825 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a33cb9a-3c1d-4e67-b523-b81a49b1de4e-kube-api-access-4g92c" (OuterVolumeSpecName: "kube-api-access-4g92c") pod "6a33cb9a-3c1d-4e67-b523-b81a49b1de4e" (UID: "6a33cb9a-3c1d-4e67-b523-b81a49b1de4e"). InnerVolumeSpecName "kube-api-access-4g92c". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 25 00:00:54.770929 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:54.770877 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a33cb9a-3c1d-4e67-b523-b81a49b1de4e-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "6a33cb9a-3c1d-4e67-b523-b81a49b1de4e" (UID: "6a33cb9a-3c1d-4e67-b523-b81a49b1de4e"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 25 00:00:54.870185 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:54.870155 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4g92c\" (UniqueName: \"kubernetes.io/projected/6a33cb9a-3c1d-4e67-b523-b81a49b1de4e-kube-api-access-4g92c\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\"" Apr 25 00:00:54.870185 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:54.870180 2569 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6a33cb9a-3c1d-4e67-b523-b81a49b1de4e-console-serving-cert\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\"" Apr 25 00:00:54.870185 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:54.870190 2569 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6a33cb9a-3c1d-4e67-b523-b81a49b1de4e-service-ca\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\"" Apr 25 00:00:54.870433 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:54.870199 2569 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6a33cb9a-3c1d-4e67-b523-b81a49b1de4e-trusted-ca-bundle\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\"" Apr 25 00:00:54.870433 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:54.870208 2569 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/6a33cb9a-3c1d-4e67-b523-b81a49b1de4e-oauth-serving-cert\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\"" Apr 25 00:00:54.870433 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:54.870216 2569 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6a33cb9a-3c1d-4e67-b523-b81a49b1de4e-console-config\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\"" Apr 25 00:00:54.870433 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:54.870225 2569 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6a33cb9a-3c1d-4e67-b523-b81a49b1de4e-console-oauth-config\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\"" Apr 25 00:00:55.477768 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:55.477742 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-786d889dfb-86rvz_6a33cb9a-3c1d-4e67-b523-b81a49b1de4e/console/0.log" Apr 25 00:00:55.478257 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:55.477780 2569 generic.go:358] "Generic (PLEG): container finished" podID="6a33cb9a-3c1d-4e67-b523-b81a49b1de4e" containerID="1d1c1d0c3e5e40a25208161ca305660e57c4eeb83d907c3bb50d0bde6a5ff30b" exitCode=2 Apr 25 00:00:55.478257 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:55.477809 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-786d889dfb-86rvz" event={"ID":"6a33cb9a-3c1d-4e67-b523-b81a49b1de4e","Type":"ContainerDied","Data":"1d1c1d0c3e5e40a25208161ca305660e57c4eeb83d907c3bb50d0bde6a5ff30b"} Apr 25 00:00:55.478257 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:55.477845 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-786d889dfb-86rvz" event={"ID":"6a33cb9a-3c1d-4e67-b523-b81a49b1de4e","Type":"ContainerDied","Data":"46f0d05e361bdab10d28bf9f3e87249011564e7c7a4e4587fd2518a2616edc7c"} Apr 25 00:00:55.478257 
ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:55.477861 2569 scope.go:117] "RemoveContainer" containerID="1d1c1d0c3e5e40a25208161ca305660e57c4eeb83d907c3bb50d0bde6a5ff30b" Apr 25 00:00:55.478257 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:55.477862 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-786d889dfb-86rvz" Apr 25 00:00:55.486995 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:55.486980 2569 scope.go:117] "RemoveContainer" containerID="1d1c1d0c3e5e40a25208161ca305660e57c4eeb83d907c3bb50d0bde6a5ff30b" Apr 25 00:00:55.487232 ip-10-0-133-214 kubenswrapper[2569]: E0425 00:00:55.487214 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d1c1d0c3e5e40a25208161ca305660e57c4eeb83d907c3bb50d0bde6a5ff30b\": container with ID starting with 1d1c1d0c3e5e40a25208161ca305660e57c4eeb83d907c3bb50d0bde6a5ff30b not found: ID does not exist" containerID="1d1c1d0c3e5e40a25208161ca305660e57c4eeb83d907c3bb50d0bde6a5ff30b" Apr 25 00:00:55.487308 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:55.487239 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d1c1d0c3e5e40a25208161ca305660e57c4eeb83d907c3bb50d0bde6a5ff30b"} err="failed to get container status \"1d1c1d0c3e5e40a25208161ca305660e57c4eeb83d907c3bb50d0bde6a5ff30b\": rpc error: code = NotFound desc = could not find container \"1d1c1d0c3e5e40a25208161ca305660e57c4eeb83d907c3bb50d0bde6a5ff30b\": container with ID starting with 1d1c1d0c3e5e40a25208161ca305660e57c4eeb83d907c3bb50d0bde6a5ff30b not found: ID does not exist" Apr 25 00:00:55.499341 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:55.499314 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-786d889dfb-86rvz"] Apr 25 00:00:55.504412 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:55.504369 2569 kubelet.go:2547] "SyncLoop REMOVE" 
source="api" pods=["openshift-console/console-786d889dfb-86rvz"] Apr 25 00:00:55.977490 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:00:55.977455 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a33cb9a-3c1d-4e67-b523-b81a49b1de4e" path="/var/lib/kubelet/pods/6a33cb9a-3c1d-4e67-b523-b81a49b1de4e/volumes" Apr 25 00:01:01.526840 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:01.526764 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-9gzlv"] Apr 25 00:01:01.527219 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:01.527091 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6a33cb9a-3c1d-4e67-b523-b81a49b1de4e" containerName="console" Apr 25 00:01:01.527219 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:01.527102 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a33cb9a-3c1d-4e67-b523-b81a49b1de4e" containerName="console" Apr 25 00:01:01.527219 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:01.527196 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="6a33cb9a-3c1d-4e67-b523-b81a49b1de4e" containerName="console" Apr 25 00:01:01.531589 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:01.531573 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-9gzlv" Apr 25 00:01:01.534015 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:01.533989 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 25 00:01:01.534124 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:01.534047 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-6768d\"" Apr 25 00:01:01.538714 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:01.538693 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-9gzlv"] Apr 25 00:01:01.618160 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:01.618120 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-9gzlv"] Apr 25 00:01:01.621027 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:01.620991 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/38b68ade-6b47-4cbb-9b89-2d95b0a63d0b-config-file\") pod \"limitador-limitador-64c8f475fb-9gzlv\" (UID: \"38b68ade-6b47-4cbb-9b89-2d95b0a63d0b\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-9gzlv" Apr 25 00:01:01.621167 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:01.621044 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dmn4\" (UniqueName: \"kubernetes.io/projected/38b68ade-6b47-4cbb-9b89-2d95b0a63d0b-kube-api-access-6dmn4\") pod \"limitador-limitador-64c8f475fb-9gzlv\" (UID: \"38b68ade-6b47-4cbb-9b89-2d95b0a63d0b\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-9gzlv" Apr 25 00:01:01.722062 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:01.722025 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" 
(UniqueName: \"kubernetes.io/configmap/38b68ade-6b47-4cbb-9b89-2d95b0a63d0b-config-file\") pod \"limitador-limitador-64c8f475fb-9gzlv\" (UID: \"38b68ade-6b47-4cbb-9b89-2d95b0a63d0b\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-9gzlv" Apr 25 00:01:01.722231 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:01.722083 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6dmn4\" (UniqueName: \"kubernetes.io/projected/38b68ade-6b47-4cbb-9b89-2d95b0a63d0b-kube-api-access-6dmn4\") pod \"limitador-limitador-64c8f475fb-9gzlv\" (UID: \"38b68ade-6b47-4cbb-9b89-2d95b0a63d0b\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-9gzlv" Apr 25 00:01:01.722770 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:01.722748 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/38b68ade-6b47-4cbb-9b89-2d95b0a63d0b-config-file\") pod \"limitador-limitador-64c8f475fb-9gzlv\" (UID: \"38b68ade-6b47-4cbb-9b89-2d95b0a63d0b\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-9gzlv" Apr 25 00:01:01.730229 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:01.730205 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dmn4\" (UniqueName: \"kubernetes.io/projected/38b68ade-6b47-4cbb-9b89-2d95b0a63d0b-kube-api-access-6dmn4\") pod \"limitador-limitador-64c8f475fb-9gzlv\" (UID: \"38b68ade-6b47-4cbb-9b89-2d95b0a63d0b\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-9gzlv" Apr 25 00:01:01.843042 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:01.842962 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-9gzlv" Apr 25 00:01:01.971057 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:01.971029 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-9gzlv"] Apr 25 00:01:01.973466 ip-10-0-133-214 kubenswrapper[2569]: W0425 00:01:01.973434 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38b68ade_6b47_4cbb_9b89_2d95b0a63d0b.slice/crio-bbaf16cef1e03c4a21a6ef2193e3abfe930d1abc893d1c6b3e8fbac154c23ccb WatchSource:0}: Error finding container bbaf16cef1e03c4a21a6ef2193e3abfe930d1abc893d1c6b3e8fbac154c23ccb: Status 404 returned error can't find the container with id bbaf16cef1e03c4a21a6ef2193e3abfe930d1abc893d1c6b3e8fbac154c23ccb Apr 25 00:01:02.503922 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:02.503887 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-9gzlv" event={"ID":"38b68ade-6b47-4cbb-9b89-2d95b0a63d0b","Type":"ContainerStarted","Data":"bbaf16cef1e03c4a21a6ef2193e3abfe930d1abc893d1c6b3e8fbac154c23ccb"} Apr 25 00:01:02.579323 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:02.579294 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-79cbc94b89-4bs7s"] Apr 25 00:01:02.582759 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:02.582739 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-4bs7s" Apr 25 00:01:02.585376 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:02.585358 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-tt2fv\"" Apr 25 00:01:02.591741 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:02.591717 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-4bs7s"] Apr 25 00:01:02.628620 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:02.628593 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdnxb\" (UniqueName: \"kubernetes.io/projected/9c8e685f-f1db-4427-a5c9-0a5dd53e9e35-kube-api-access-vdnxb\") pod \"authorino-79cbc94b89-4bs7s\" (UID: \"9c8e685f-f1db-4427-a5c9-0a5dd53e9e35\") " pod="kuadrant-system/authorino-79cbc94b89-4bs7s" Apr 25 00:01:02.729348 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:02.729315 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vdnxb\" (UniqueName: \"kubernetes.io/projected/9c8e685f-f1db-4427-a5c9-0a5dd53e9e35-kube-api-access-vdnxb\") pod \"authorino-79cbc94b89-4bs7s\" (UID: \"9c8e685f-f1db-4427-a5c9-0a5dd53e9e35\") " pod="kuadrant-system/authorino-79cbc94b89-4bs7s" Apr 25 00:01:02.737722 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:02.737690 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdnxb\" (UniqueName: \"kubernetes.io/projected/9c8e685f-f1db-4427-a5c9-0a5dd53e9e35-kube-api-access-vdnxb\") pod \"authorino-79cbc94b89-4bs7s\" (UID: \"9c8e685f-f1db-4427-a5c9-0a5dd53e9e35\") " pod="kuadrant-system/authorino-79cbc94b89-4bs7s" Apr 25 00:01:02.892682 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:02.892611 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-4bs7s" Apr 25 00:01:02.966900 ip-10-0-133-214 kubenswrapper[2569]: E0425 00:01:02.964574 2569 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = unable to pull image or OCI artifact: pull image err: copying system image from manifest list: parsing image configuration: fetching blob: received unexpected HTTP status: 504 Gateway Time-out; artifact err: provided artifact is a container image" image="registry.redhat.io/rhcl-1/dns-rhel9-operator@sha256:b4e7ba67509320ca9ac5d63cc4add987fad05b098c4a7cd8dd91f264731177cf" Apr 25 00:01:02.966900 ip-10-0-133-214 kubenswrapper[2569]: E0425 00:01:02.964854 2569 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:manager,Image:registry.redhat.io/rhcl-1/dns-rhel9-operator@sha256:b4e7ba67509320ca9ac5d63cc4add987fad05b098c4a7cd8dd91f264731177cf,Command:[/manager],Args:[--metrics-bind-address=:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:8080,Protocol:TCP,HostIP:,},ContainerPort{Name:pprof,HostPort:0,ContainerPort:8082,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:WATCH_NAMESPACES,Value:,ValueFrom:nil,},EnvVar{Name:CLUSTER_SECRET_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:OPERATOR_CONDITION_NAME,Value:dns-operator.v1.3.0,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bxmpj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000670000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:dns-operator-controller-env,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dns-operator-controller-manager-844548ff4c-rkxq5_kuadrant-system(7c3b8652-c669-4e33-ae9c-b60cfae028c4): ErrImagePull: unable to pull image or OCI artifact: pull image err: copying system image from manifest list: parsing image configuration: fetching blob: received unexpected HTTP status: 504 Gateway Time-out; artifact err: provided artifact is a container 
image" logger="UnhandledError" Apr 25 00:01:02.967276 ip-10-0-133-214 kubenswrapper[2569]: E0425 00:01:02.967213 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"unable to pull image or OCI artifact: pull image err: copying system image from manifest list: parsing image configuration: fetching blob: received unexpected HTTP status: 504 Gateway Time-out; artifact err: provided artifact is a container image\"" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-rkxq5" podUID="7c3b8652-c669-4e33-ae9c-b60cfae028c4" Apr 25 00:01:03.033687 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:03.033659 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-4bs7s"] Apr 25 00:01:03.035177 ip-10-0-133-214 kubenswrapper[2569]: W0425 00:01:03.035147 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9c8e685f_f1db_4427_a5c9_0a5dd53e9e35.slice/crio-bdaaed00cfdc94eda94fdef40fe9e31a65c053766bd8a57bdf2e37695e08d99d WatchSource:0}: Error finding container bdaaed00cfdc94eda94fdef40fe9e31a65c053766bd8a57bdf2e37695e08d99d: Status 404 returned error can't find the container with id bdaaed00cfdc94eda94fdef40fe9e31a65c053766bd8a57bdf2e37695e08d99d Apr 25 00:01:03.509485 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:03.509445 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-4bs7s" event={"ID":"9c8e685f-f1db-4427-a5c9-0a5dd53e9e35","Type":"ContainerStarted","Data":"bdaaed00cfdc94eda94fdef40fe9e31a65c053766bd8a57bdf2e37695e08d99d"} Apr 25 00:01:03.510906 ip-10-0-133-214 kubenswrapper[2569]: E0425 00:01:03.510849 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"registry.redhat.io/rhcl-1/dns-rhel9-operator@sha256:b4e7ba67509320ca9ac5d63cc4add987fad05b098c4a7cd8dd91f264731177cf\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: copying system image from manifest list: parsing image configuration: fetching blob: received unexpected HTTP status: 504 Gateway Time-out; artifact err: provided artifact is a container image\"" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-rkxq5" podUID="7c3b8652-c669-4e33-ae9c-b60cfae028c4" Apr 25 00:01:08.531141 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:08.531103 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-4bs7s" event={"ID":"9c8e685f-f1db-4427-a5c9-0a5dd53e9e35","Type":"ContainerStarted","Data":"bb49999e5814445e252edb21444ab47b6aee159ee2a9e326bfadfb28c325738c"} Apr 25 00:01:08.532357 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:08.532334 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-9gzlv" event={"ID":"38b68ade-6b47-4cbb-9b89-2d95b0a63d0b","Type":"ContainerStarted","Data":"8b65b3569d7495c485e0f173ed053fad25b09a8b2222b39f02f731683f96daf5"} Apr 25 00:01:08.532487 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:08.532471 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-64c8f475fb-9gzlv" Apr 25 00:01:08.546244 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:08.546202 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-79cbc94b89-4bs7s" podStartSLOduration=1.788169463 podStartE2EDuration="6.546190939s" podCreationTimestamp="2026-04-25 00:01:02 +0000 UTC" firstStartedPulling="2026-04-25 00:01:03.036486221 +0000 UTC m=+441.641212799" lastFinishedPulling="2026-04-25 00:01:07.794507694 +0000 UTC m=+446.399234275" observedRunningTime="2026-04-25 00:01:08.544884258 +0000 UTC m=+447.149610856" 
watchObservedRunningTime="2026-04-25 00:01:08.546190939 +0000 UTC m=+447.150917566" Apr 25 00:01:08.561827 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:08.561785 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-64c8f475fb-9gzlv" podStartSLOduration=1.792847721 podStartE2EDuration="7.561770202s" podCreationTimestamp="2026-04-25 00:01:01 +0000 UTC" firstStartedPulling="2026-04-25 00:01:01.975381412 +0000 UTC m=+440.580107988" lastFinishedPulling="2026-04-25 00:01:07.744303889 +0000 UTC m=+446.349030469" observedRunningTime="2026-04-25 00:01:08.559621226 +0000 UTC m=+447.164347825" watchObservedRunningTime="2026-04-25 00:01:08.561770202 +0000 UTC m=+447.166496800" Apr 25 00:01:18.048755 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:18.048731 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-env\"" Apr 25 00:01:18.455150 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:18.455064 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-9gzlv"] Apr 25 00:01:18.455511 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:18.455305 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-limitador-64c8f475fb-9gzlv" podUID="38b68ade-6b47-4cbb-9b89-2d95b0a63d0b" containerName="limitador" containerID="cri-o://8b65b3569d7495c485e0f173ed053fad25b09a8b2222b39f02f731683f96daf5" gracePeriod=30 Apr 25 00:01:18.457214 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:18.457193 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-64c8f475fb-9gzlv" Apr 25 00:01:18.570604 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:18.570565 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-rkxq5" 
event={"ID":"7c3b8652-c669-4e33-ae9c-b60cfae028c4","Type":"ContainerStarted","Data":"c454073d67e1e31595e4916589375acb51e23188d7eff703dc044d5fafab6b70"}
Apr 25 00:01:18.570796 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:18.570775 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-rkxq5"
Apr 25 00:01:18.588370 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:18.588326 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-rkxq5" podStartSLOduration=2.051311518 podStartE2EDuration="1m19.588314703s" podCreationTimestamp="2026-04-24 23:59:59 +0000 UTC" firstStartedPulling="2026-04-25 00:00:00.509247427 +0000 UTC m=+379.113974003" lastFinishedPulling="2026-04-25 00:01:18.046250597 +0000 UTC m=+456.650977188" observedRunningTime="2026-04-25 00:01:18.586582709 +0000 UTC m=+457.191309520" watchObservedRunningTime="2026-04-25 00:01:18.588314703 +0000 UTC m=+457.193041301"
Apr 25 00:01:18.989575 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:18.989550 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-9gzlv"
Apr 25 00:01:19.169600 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:19.169521 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6dmn4\" (UniqueName: \"kubernetes.io/projected/38b68ade-6b47-4cbb-9b89-2d95b0a63d0b-kube-api-access-6dmn4\") pod \"38b68ade-6b47-4cbb-9b89-2d95b0a63d0b\" (UID: \"38b68ade-6b47-4cbb-9b89-2d95b0a63d0b\") "
Apr 25 00:01:19.169919 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:19.169603 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/38b68ade-6b47-4cbb-9b89-2d95b0a63d0b-config-file\") pod \"38b68ade-6b47-4cbb-9b89-2d95b0a63d0b\" (UID: \"38b68ade-6b47-4cbb-9b89-2d95b0a63d0b\") "
Apr 25 00:01:19.169964 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:19.169921 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38b68ade-6b47-4cbb-9b89-2d95b0a63d0b-config-file" (OuterVolumeSpecName: "config-file") pod "38b68ade-6b47-4cbb-9b89-2d95b0a63d0b" (UID: "38b68ade-6b47-4cbb-9b89-2d95b0a63d0b"). InnerVolumeSpecName "config-file". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 25 00:01:19.171576 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:19.171547 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38b68ade-6b47-4cbb-9b89-2d95b0a63d0b-kube-api-access-6dmn4" (OuterVolumeSpecName: "kube-api-access-6dmn4") pod "38b68ade-6b47-4cbb-9b89-2d95b0a63d0b" (UID: "38b68ade-6b47-4cbb-9b89-2d95b0a63d0b"). InnerVolumeSpecName "kube-api-access-6dmn4". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 25 00:01:19.270739 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:19.270707 2569 reconciler_common.go:299] "Volume detached for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/38b68ade-6b47-4cbb-9b89-2d95b0a63d0b-config-file\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\""
Apr 25 00:01:19.270739 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:19.270735 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6dmn4\" (UniqueName: \"kubernetes.io/projected/38b68ade-6b47-4cbb-9b89-2d95b0a63d0b-kube-api-access-6dmn4\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\""
Apr 25 00:01:19.575315 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:19.575273 2569 generic.go:358] "Generic (PLEG): container finished" podID="38b68ade-6b47-4cbb-9b89-2d95b0a63d0b" containerID="8b65b3569d7495c485e0f173ed053fad25b09a8b2222b39f02f731683f96daf5" exitCode=0
Apr 25 00:01:19.575506 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:19.575328 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-9gzlv"
Apr 25 00:01:19.575506 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:19.575357 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-9gzlv" event={"ID":"38b68ade-6b47-4cbb-9b89-2d95b0a63d0b","Type":"ContainerDied","Data":"8b65b3569d7495c485e0f173ed053fad25b09a8b2222b39f02f731683f96daf5"}
Apr 25 00:01:19.575506 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:19.575424 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-9gzlv" event={"ID":"38b68ade-6b47-4cbb-9b89-2d95b0a63d0b","Type":"ContainerDied","Data":"bbaf16cef1e03c4a21a6ef2193e3abfe930d1abc893d1c6b3e8fbac154c23ccb"}
Apr 25 00:01:19.575506 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:19.575445 2569 scope.go:117] "RemoveContainer" containerID="8b65b3569d7495c485e0f173ed053fad25b09a8b2222b39f02f731683f96daf5"
Apr 25 00:01:19.583907 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:19.583890 2569 scope.go:117] "RemoveContainer" containerID="8b65b3569d7495c485e0f173ed053fad25b09a8b2222b39f02f731683f96daf5"
Apr 25 00:01:19.584112 ip-10-0-133-214 kubenswrapper[2569]: E0425 00:01:19.584094 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b65b3569d7495c485e0f173ed053fad25b09a8b2222b39f02f731683f96daf5\": container with ID starting with 8b65b3569d7495c485e0f173ed053fad25b09a8b2222b39f02f731683f96daf5 not found: ID does not exist" containerID="8b65b3569d7495c485e0f173ed053fad25b09a8b2222b39f02f731683f96daf5"
Apr 25 00:01:19.584160 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:19.584121 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b65b3569d7495c485e0f173ed053fad25b09a8b2222b39f02f731683f96daf5"} err="failed to get container status \"8b65b3569d7495c485e0f173ed053fad25b09a8b2222b39f02f731683f96daf5\": rpc error: code = NotFound desc = could not find container \"8b65b3569d7495c485e0f173ed053fad25b09a8b2222b39f02f731683f96daf5\": container with ID starting with 8b65b3569d7495c485e0f173ed053fad25b09a8b2222b39f02f731683f96daf5 not found: ID does not exist"
Apr 25 00:01:19.597191 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:19.597166 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-9gzlv"]
Apr 25 00:01:19.602263 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:19.602240 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-9gzlv"]
Apr 25 00:01:19.977628 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:19.977559 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38b68ade-6b47-4cbb-9b89-2d95b0a63d0b" path="/var/lib/kubelet/pods/38b68ade-6b47-4cbb-9b89-2d95b0a63d0b/volumes"
Apr 25 00:01:29.577626 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:29.577592 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-rkxq5"
Apr 25 00:01:30.567698 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:30.567665 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-4bs7s"]
Apr 25 00:01:30.567943 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:30.567916 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-79cbc94b89-4bs7s" podUID="9c8e685f-f1db-4427-a5c9-0a5dd53e9e35" containerName="authorino" containerID="cri-o://bb49999e5814445e252edb21444ab47b6aee159ee2a9e326bfadfb28c325738c" gracePeriod=30
Apr 25 00:01:30.804801 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:30.804777 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-4bs7s"
Apr 25 00:01:30.966452 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:30.966345 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vdnxb\" (UniqueName: \"kubernetes.io/projected/9c8e685f-f1db-4427-a5c9-0a5dd53e9e35-kube-api-access-vdnxb\") pod \"9c8e685f-f1db-4427-a5c9-0a5dd53e9e35\" (UID: \"9c8e685f-f1db-4427-a5c9-0a5dd53e9e35\") "
Apr 25 00:01:30.968364 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:30.968332 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c8e685f-f1db-4427-a5c9-0a5dd53e9e35-kube-api-access-vdnxb" (OuterVolumeSpecName: "kube-api-access-vdnxb") pod "9c8e685f-f1db-4427-a5c9-0a5dd53e9e35" (UID: "9c8e685f-f1db-4427-a5c9-0a5dd53e9e35"). InnerVolumeSpecName "kube-api-access-vdnxb". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 25 00:01:31.067142 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:31.067113 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vdnxb\" (UniqueName: \"kubernetes.io/projected/9c8e685f-f1db-4427-a5c9-0a5dd53e9e35-kube-api-access-vdnxb\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\""
Apr 25 00:01:31.619951 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:31.619916 2569 generic.go:358] "Generic (PLEG): container finished" podID="9c8e685f-f1db-4427-a5c9-0a5dd53e9e35" containerID="bb49999e5814445e252edb21444ab47b6aee159ee2a9e326bfadfb28c325738c" exitCode=0
Apr 25 00:01:31.620141 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:31.619956 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-4bs7s" event={"ID":"9c8e685f-f1db-4427-a5c9-0a5dd53e9e35","Type":"ContainerDied","Data":"bb49999e5814445e252edb21444ab47b6aee159ee2a9e326bfadfb28c325738c"}
Apr 25 00:01:31.620141 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:31.619963 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-4bs7s"
Apr 25 00:01:31.620141 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:31.619984 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-4bs7s" event={"ID":"9c8e685f-f1db-4427-a5c9-0a5dd53e9e35","Type":"ContainerDied","Data":"bdaaed00cfdc94eda94fdef40fe9e31a65c053766bd8a57bdf2e37695e08d99d"}
Apr 25 00:01:31.620141 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:31.620002 2569 scope.go:117] "RemoveContainer" containerID="bb49999e5814445e252edb21444ab47b6aee159ee2a9e326bfadfb28c325738c"
Apr 25 00:01:31.628704 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:31.628687 2569 scope.go:117] "RemoveContainer" containerID="bb49999e5814445e252edb21444ab47b6aee159ee2a9e326bfadfb28c325738c"
Apr 25 00:01:31.628953 ip-10-0-133-214 kubenswrapper[2569]: E0425 00:01:31.628933 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb49999e5814445e252edb21444ab47b6aee159ee2a9e326bfadfb28c325738c\": container with ID starting with bb49999e5814445e252edb21444ab47b6aee159ee2a9e326bfadfb28c325738c not found: ID does not exist" containerID="bb49999e5814445e252edb21444ab47b6aee159ee2a9e326bfadfb28c325738c"
Apr 25 00:01:31.629001 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:31.628963 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb49999e5814445e252edb21444ab47b6aee159ee2a9e326bfadfb28c325738c"} err="failed to get container status \"bb49999e5814445e252edb21444ab47b6aee159ee2a9e326bfadfb28c325738c\": rpc error: code = NotFound desc = could not find container \"bb49999e5814445e252edb21444ab47b6aee159ee2a9e326bfadfb28c325738c\": container with ID starting with bb49999e5814445e252edb21444ab47b6aee159ee2a9e326bfadfb28c325738c not found: ID does not exist"
Apr 25 00:01:31.640006 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:31.639977 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-4bs7s"]
Apr 25 00:01:31.643385 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:31.643364 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-4bs7s"]
Apr 25 00:01:31.976516 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:31.976437 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c8e685f-f1db-4427-a5c9-0a5dd53e9e35" path="/var/lib/kubelet/pods/9c8e685f-f1db-4427-a5c9-0a5dd53e9e35/volumes"
Apr 25 00:01:37.751601 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:37.751567 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-2lzjq"]
Apr 25 00:01:37.752004 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:37.751870 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="38b68ade-6b47-4cbb-9b89-2d95b0a63d0b" containerName="limitador"
Apr 25 00:01:37.752004 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:37.751880 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="38b68ade-6b47-4cbb-9b89-2d95b0a63d0b" containerName="limitador"
Apr 25 00:01:37.752004 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:37.751902 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9c8e685f-f1db-4427-a5c9-0a5dd53e9e35" containerName="authorino"
Apr 25 00:01:37.752004 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:37.751908 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c8e685f-f1db-4427-a5c9-0a5dd53e9e35" containerName="authorino"
Apr 25 00:01:37.752004 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:37.751952 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="9c8e685f-f1db-4427-a5c9-0a5dd53e9e35" containerName="authorino"
Apr 25 00:01:37.752004 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:37.751961 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="38b68ade-6b47-4cbb-9b89-2d95b0a63d0b" containerName="limitador"
Apr 25 00:01:37.767572 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:37.767549 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-2lzjq"
Apr 25 00:01:37.779207 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:37.779181 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-2lzjq"]
Apr 25 00:01:37.814294 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:37.814260 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/58046be0-da36-4df4-af44-0f6f68c596ea-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-2lzjq\" (UID: \"58046be0-da36-4df4-af44-0f6f68c596ea\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-2lzjq"
Apr 25 00:01:37.814475 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:37.814331 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/58046be0-da36-4df4-af44-0f6f68c596ea-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-2lzjq\" (UID: \"58046be0-da36-4df4-af44-0f6f68c596ea\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-2lzjq"
Apr 25 00:01:37.814475 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:37.814353 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/58046be0-da36-4df4-af44-0f6f68c596ea-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-2lzjq\" (UID: \"58046be0-da36-4df4-af44-0f6f68c596ea\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-2lzjq"
Apr 25 00:01:37.814475 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:37.814373 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/58046be0-da36-4df4-af44-0f6f68c596ea-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-2lzjq\" (UID: \"58046be0-da36-4df4-af44-0f6f68c596ea\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-2lzjq"
Apr 25 00:01:37.814475 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:37.814394 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62bsg\" (UniqueName: \"kubernetes.io/projected/58046be0-da36-4df4-af44-0f6f68c596ea-kube-api-access-62bsg\") pod \"istiod-openshift-gateway-55ff986f96-2lzjq\" (UID: \"58046be0-da36-4df4-af44-0f6f68c596ea\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-2lzjq"
Apr 25 00:01:37.814475 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:37.814465 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/58046be0-da36-4df4-af44-0f6f68c596ea-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-2lzjq\" (UID: \"58046be0-da36-4df4-af44-0f6f68c596ea\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-2lzjq"
Apr 25 00:01:37.814674 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:37.814487 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/58046be0-da36-4df4-af44-0f6f68c596ea-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-2lzjq\" (UID: \"58046be0-da36-4df4-af44-0f6f68c596ea\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-2lzjq"
Apr 25 00:01:37.915451 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:37.915393 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/58046be0-da36-4df4-af44-0f6f68c596ea-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-2lzjq\" (UID: \"58046be0-da36-4df4-af44-0f6f68c596ea\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-2lzjq"
Apr 25 00:01:37.915646 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:37.915458 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/58046be0-da36-4df4-af44-0f6f68c596ea-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-2lzjq\" (UID: \"58046be0-da36-4df4-af44-0f6f68c596ea\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-2lzjq"
Apr 25 00:01:37.915646 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:37.915495 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/58046be0-da36-4df4-af44-0f6f68c596ea-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-2lzjq\" (UID: \"58046be0-da36-4df4-af44-0f6f68c596ea\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-2lzjq"
Apr 25 00:01:37.915646 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:37.915573 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/58046be0-da36-4df4-af44-0f6f68c596ea-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-2lzjq\" (UID: \"58046be0-da36-4df4-af44-0f6f68c596ea\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-2lzjq"
Apr 25 00:01:37.915646 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:37.915607 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/58046be0-da36-4df4-af44-0f6f68c596ea-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-2lzjq\" (UID: \"58046be0-da36-4df4-af44-0f6f68c596ea\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-2lzjq"
Apr 25 00:01:37.915863 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:37.915742 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/58046be0-da36-4df4-af44-0f6f68c596ea-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-2lzjq\" (UID: \"58046be0-da36-4df4-af44-0f6f68c596ea\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-2lzjq"
Apr 25 00:01:37.915863 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:37.915787 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-62bsg\" (UniqueName: \"kubernetes.io/projected/58046be0-da36-4df4-af44-0f6f68c596ea-kube-api-access-62bsg\") pod \"istiod-openshift-gateway-55ff986f96-2lzjq\" (UID: \"58046be0-da36-4df4-af44-0f6f68c596ea\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-2lzjq"
Apr 25 00:01:37.916456 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:37.916425 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/58046be0-da36-4df4-af44-0f6f68c596ea-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-2lzjq\" (UID: \"58046be0-da36-4df4-af44-0f6f68c596ea\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-2lzjq"
Apr 25 00:01:37.918036 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:37.918005 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/58046be0-da36-4df4-af44-0f6f68c596ea-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-2lzjq\" (UID: \"58046be0-da36-4df4-af44-0f6f68c596ea\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-2lzjq"
Apr 25 00:01:37.918149 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:37.918005 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/58046be0-da36-4df4-af44-0f6f68c596ea-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-2lzjq\" (UID: \"58046be0-da36-4df4-af44-0f6f68c596ea\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-2lzjq"
Apr 25 00:01:37.918149 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:37.918043 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/58046be0-da36-4df4-af44-0f6f68c596ea-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-2lzjq\" (UID: \"58046be0-da36-4df4-af44-0f6f68c596ea\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-2lzjq"
Apr 25 00:01:37.918149 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:37.918117 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/58046be0-da36-4df4-af44-0f6f68c596ea-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-2lzjq\" (UID: \"58046be0-da36-4df4-af44-0f6f68c596ea\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-2lzjq"
Apr 25 00:01:37.923655 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:37.923628 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/58046be0-da36-4df4-af44-0f6f68c596ea-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-2lzjq\" (UID: \"58046be0-da36-4df4-af44-0f6f68c596ea\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-2lzjq"
Apr 25 00:01:37.923891 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:37.923875 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-62bsg\" (UniqueName: \"kubernetes.io/projected/58046be0-da36-4df4-af44-0f6f68c596ea-kube-api-access-62bsg\") pod \"istiod-openshift-gateway-55ff986f96-2lzjq\" (UID: \"58046be0-da36-4df4-af44-0f6f68c596ea\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-2lzjq"
Apr 25 00:01:38.077131 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:38.077045 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-2lzjq"
Apr 25 00:01:38.416567 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:38.416382 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-2lzjq"]
Apr 25 00:01:38.418847 ip-10-0-133-214 kubenswrapper[2569]: W0425 00:01:38.418814 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58046be0_da36_4df4_af44_0f6f68c596ea.slice/crio-c63bd9fed590828e360a9856d7c9edb1beb1a5160db8ba1def27b3ba3a122cac WatchSource:0}: Error finding container c63bd9fed590828e360a9856d7c9edb1beb1a5160db8ba1def27b3ba3a122cac: Status 404 returned error can't find the container with id c63bd9fed590828e360a9856d7c9edb1beb1a5160db8ba1def27b3ba3a122cac
Apr 25 00:01:38.420885 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:38.420848 2569 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"}
Apr 25 00:01:38.420978 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:38.420915 2569 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"}
Apr 25 00:01:38.649560 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:38.649523 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-2lzjq" event={"ID":"58046be0-da36-4df4-af44-0f6f68c596ea","Type":"ContainerStarted","Data":"3d6c210a06d6e982c925b86adda35f262f4ccfc8c22f1c926b0177c5af9f7734"}
Apr 25 00:01:38.649560 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:38.649563 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-2lzjq" event={"ID":"58046be0-da36-4df4-af44-0f6f68c596ea","Type":"ContainerStarted","Data":"c63bd9fed590828e360a9856d7c9edb1beb1a5160db8ba1def27b3ba3a122cac"}
Apr 25 00:01:38.649767 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:38.649612 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-2lzjq"
Apr 25 00:01:38.674521 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:38.674426 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-2lzjq" podStartSLOduration=1.67441154 podStartE2EDuration="1.67441154s" podCreationTimestamp="2026-04-25 00:01:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-25 00:01:38.671483354 +0000 UTC m=+477.276209952" watchObservedRunningTime="2026-04-25 00:01:38.67441154 +0000 UTC m=+477.279138129"
Apr 25 00:01:39.655576 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:39.655544 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-2lzjq"
Apr 25 00:01:39.765025 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:39.764989 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-q42qc"]
Apr 25 00:01:39.765267 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:39.765242 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-q42qc" podUID="a0a9deeb-3889-4e4b-bca0-c25ded8bb5f2" containerName="discovery" containerID="cri-o://44ff435f79c5960551a0bafa614cdb59b0ac6c37ea2f3bd37360e9aa1cea7f31" gracePeriod=30
Apr 25 00:01:40.014064 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:40.014038 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-q42qc"
Apr 25 00:01:40.033830 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:40.033804 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/a0a9deeb-3889-4e4b-bca0-c25ded8bb5f2-istio-csr-dns-cert\") pod \"a0a9deeb-3889-4e4b-bca0-c25ded8bb5f2\" (UID: \"a0a9deeb-3889-4e4b-bca0-c25ded8bb5f2\") "
Apr 25 00:01:40.033974 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:40.033863 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gzxlt\" (UniqueName: \"kubernetes.io/projected/a0a9deeb-3889-4e4b-bca0-c25ded8bb5f2-kube-api-access-gzxlt\") pod \"a0a9deeb-3889-4e4b-bca0-c25ded8bb5f2\" (UID: \"a0a9deeb-3889-4e4b-bca0-c25ded8bb5f2\") "
Apr 25 00:01:40.033974 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:40.033884 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/a0a9deeb-3889-4e4b-bca0-c25ded8bb5f2-cacerts\") pod \"a0a9deeb-3889-4e4b-bca0-c25ded8bb5f2\" (UID: \"a0a9deeb-3889-4e4b-bca0-c25ded8bb5f2\") "
Apr 25 00:01:40.033974 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:40.033918 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/a0a9deeb-3889-4e4b-bca0-c25ded8bb5f2-local-certs\") pod \"a0a9deeb-3889-4e4b-bca0-c25ded8bb5f2\" (UID: \"a0a9deeb-3889-4e4b-bca0-c25ded8bb5f2\") "
Apr 25 00:01:40.033974 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:40.033940 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/a0a9deeb-3889-4e4b-bca0-c25ded8bb5f2-istio-token\") pod \"a0a9deeb-3889-4e4b-bca0-c25ded8bb5f2\" (UID: \"a0a9deeb-3889-4e4b-bca0-c25ded8bb5f2\") "
Apr 25 00:01:40.034229 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:40.034070 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/a0a9deeb-3889-4e4b-bca0-c25ded8bb5f2-istio-csr-ca-configmap\") pod \"a0a9deeb-3889-4e4b-bca0-c25ded8bb5f2\" (UID: \"a0a9deeb-3889-4e4b-bca0-c25ded8bb5f2\") "
Apr 25 00:01:40.034229 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:40.034112 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/a0a9deeb-3889-4e4b-bca0-c25ded8bb5f2-istio-kubeconfig\") pod \"a0a9deeb-3889-4e4b-bca0-c25ded8bb5f2\" (UID: \"a0a9deeb-3889-4e4b-bca0-c25ded8bb5f2\") "
Apr 25 00:01:40.034503 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:40.034471 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0a9deeb-3889-4e4b-bca0-c25ded8bb5f2-istio-csr-ca-configmap" (OuterVolumeSpecName: "istio-csr-ca-configmap") pod "a0a9deeb-3889-4e4b-bca0-c25ded8bb5f2" (UID: "a0a9deeb-3889-4e4b-bca0-c25ded8bb5f2"). InnerVolumeSpecName "istio-csr-ca-configmap". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 25 00:01:40.037062 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:40.036842 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0a9deeb-3889-4e4b-bca0-c25ded8bb5f2-local-certs" (OuterVolumeSpecName: "local-certs") pod "a0a9deeb-3889-4e4b-bca0-c25ded8bb5f2" (UID: "a0a9deeb-3889-4e4b-bca0-c25ded8bb5f2"). InnerVolumeSpecName "local-certs". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 25 00:01:40.037062 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:40.036895 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0a9deeb-3889-4e4b-bca0-c25ded8bb5f2-istio-token" (OuterVolumeSpecName: "istio-token") pod "a0a9deeb-3889-4e4b-bca0-c25ded8bb5f2" (UID: "a0a9deeb-3889-4e4b-bca0-c25ded8bb5f2"). InnerVolumeSpecName "istio-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 25 00:01:40.037062 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:40.036943 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0a9deeb-3889-4e4b-bca0-c25ded8bb5f2-istio-csr-dns-cert" (OuterVolumeSpecName: "istio-csr-dns-cert") pod "a0a9deeb-3889-4e4b-bca0-c25ded8bb5f2" (UID: "a0a9deeb-3889-4e4b-bca0-c25ded8bb5f2"). InnerVolumeSpecName "istio-csr-dns-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 25 00:01:40.037062 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:40.037026 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0a9deeb-3889-4e4b-bca0-c25ded8bb5f2-kube-api-access-gzxlt" (OuterVolumeSpecName: "kube-api-access-gzxlt") pod "a0a9deeb-3889-4e4b-bca0-c25ded8bb5f2" (UID: "a0a9deeb-3889-4e4b-bca0-c25ded8bb5f2"). InnerVolumeSpecName "kube-api-access-gzxlt". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 25 00:01:40.037683 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:40.037649 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0a9deeb-3889-4e4b-bca0-c25ded8bb5f2-cacerts" (OuterVolumeSpecName: "cacerts") pod "a0a9deeb-3889-4e4b-bca0-c25ded8bb5f2" (UID: "a0a9deeb-3889-4e4b-bca0-c25ded8bb5f2"). InnerVolumeSpecName "cacerts". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 25 00:01:40.037822 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:40.037701 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0a9deeb-3889-4e4b-bca0-c25ded8bb5f2-istio-kubeconfig" (OuterVolumeSpecName: "istio-kubeconfig") pod "a0a9deeb-3889-4e4b-bca0-c25ded8bb5f2" (UID: "a0a9deeb-3889-4e4b-bca0-c25ded8bb5f2"). InnerVolumeSpecName "istio-kubeconfig". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 25 00:01:40.135295 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:40.135246 2569 reconciler_common.go:299] "Volume detached for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/a0a9deeb-3889-4e4b-bca0-c25ded8bb5f2-istio-token\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\""
Apr 25 00:01:40.135295 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:40.135292 2569 reconciler_common.go:299] "Volume detached for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/a0a9deeb-3889-4e4b-bca0-c25ded8bb5f2-istio-csr-ca-configmap\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\""
Apr 25 00:01:40.135295 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:40.135305 2569 reconciler_common.go:299] "Volume detached for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/a0a9deeb-3889-4e4b-bca0-c25ded8bb5f2-istio-kubeconfig\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\""
Apr 25 00:01:40.135295 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:40.135314 2569 reconciler_common.go:299] "Volume detached for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/a0a9deeb-3889-4e4b-bca0-c25ded8bb5f2-istio-csr-dns-cert\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\""
Apr 25 00:01:40.135597 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:40.135323 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gzxlt\" (UniqueName: \"kubernetes.io/projected/a0a9deeb-3889-4e4b-bca0-c25ded8bb5f2-kube-api-access-gzxlt\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\""
Apr 25 00:01:40.135597 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:40.135332 2569 reconciler_common.go:299] "Volume detached for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/a0a9deeb-3889-4e4b-bca0-c25ded8bb5f2-cacerts\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\""
Apr 25 00:01:40.135597 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:40.135343 2569 reconciler_common.go:299] "Volume detached for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/a0a9deeb-3889-4e4b-bca0-c25ded8bb5f2-local-certs\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\""
Apr 25 00:01:40.659316 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:40.659281 2569 generic.go:358] "Generic (PLEG): container finished" podID="a0a9deeb-3889-4e4b-bca0-c25ded8bb5f2" containerID="44ff435f79c5960551a0bafa614cdb59b0ac6c37ea2f3bd37360e9aa1cea7f31" exitCode=0
Apr 25 00:01:40.659927 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:40.659343 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-q42qc"
Apr 25 00:01:40.659927 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:40.659376 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-q42qc" event={"ID":"a0a9deeb-3889-4e4b-bca0-c25ded8bb5f2","Type":"ContainerDied","Data":"44ff435f79c5960551a0bafa614cdb59b0ac6c37ea2f3bd37360e9aa1cea7f31"}
Apr 25 00:01:40.659927 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:40.659433 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-q42qc" event={"ID":"a0a9deeb-3889-4e4b-bca0-c25ded8bb5f2","Type":"ContainerDied","Data":"a37a2cf92f55709d6e53e33d697ff2a3e5bb3bdc0ab718e81fd887622ecc73c6"}
Apr 25 00:01:40.659927 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:40.659455 2569 scope.go:117] "RemoveContainer" containerID="44ff435f79c5960551a0bafa614cdb59b0ac6c37ea2f3bd37360e9aa1cea7f31"
Apr 25 00:01:40.668534 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:40.668514 2569 scope.go:117] "RemoveContainer" containerID="44ff435f79c5960551a0bafa614cdb59b0ac6c37ea2f3bd37360e9aa1cea7f31"
Apr 25 00:01:40.668766 ip-10-0-133-214 kubenswrapper[2569]: E0425 00:01:40.668747 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44ff435f79c5960551a0bafa614cdb59b0ac6c37ea2f3bd37360e9aa1cea7f31\": container with ID starting with 44ff435f79c5960551a0bafa614cdb59b0ac6c37ea2f3bd37360e9aa1cea7f31 not found: ID does not exist" containerID="44ff435f79c5960551a0bafa614cdb59b0ac6c37ea2f3bd37360e9aa1cea7f31"
Apr 25 00:01:40.668812 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:40.668775 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44ff435f79c5960551a0bafa614cdb59b0ac6c37ea2f3bd37360e9aa1cea7f31"} err="failed to get container status 
\"44ff435f79c5960551a0bafa614cdb59b0ac6c37ea2f3bd37360e9aa1cea7f31\": rpc error: code = NotFound desc = could not find container \"44ff435f79c5960551a0bafa614cdb59b0ac6c37ea2f3bd37360e9aa1cea7f31\": container with ID starting with 44ff435f79c5960551a0bafa614cdb59b0ac6c37ea2f3bd37360e9aa1cea7f31 not found: ID does not exist" Apr 25 00:01:40.690170 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:40.690140 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-q42qc"] Apr 25 00:01:40.700212 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:40.700185 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-q42qc"] Apr 25 00:01:41.977096 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:41.977065 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0a9deeb-3889-4e4b-bca0-c25ded8bb5f2" path="/var/lib/kubelet/pods/a0a9deeb-3889-4e4b-bca0-c25ded8bb5f2/volumes" Apr 25 00:01:48.623443 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:48.623393 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-64c4d9588d-6r5k6"] Apr 25 00:01:48.623896 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:48.623847 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a0a9deeb-3889-4e4b-bca0-c25ded8bb5f2" containerName="discovery" Apr 25 00:01:48.623896 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:48.623861 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0a9deeb-3889-4e4b-bca0-c25ded8bb5f2" containerName="discovery" Apr 25 00:01:48.623996 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:48.623943 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="a0a9deeb-3889-4e4b-bca0-c25ded8bb5f2" containerName="discovery" Apr 25 00:01:48.633383 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:48.633351 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kserve/kserve-controller-manager-64c4d9588d-6r5k6"] Apr 25 00:01:48.633562 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:48.633489 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-64c4d9588d-6r5k6" Apr 25 00:01:48.635707 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:48.635687 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\"" Apr 25 00:01:48.635839 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:48.635812 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 25 00:01:48.635897 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:48.635850 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-k9psc\"" Apr 25 00:01:48.636147 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:48.636130 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 25 00:01:48.640732 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:48.640711 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-7689784d4c-wg8df"] Apr 25 00:01:48.644210 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:48.644194 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-7689784d4c-wg8df" Apr 25 00:01:48.646415 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:48.646384 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 25 00:01:48.646517 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:48.646485 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-j29lq\"" Apr 25 00:01:48.653381 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:48.653361 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-7689784d4c-wg8df"] Apr 25 00:01:48.661004 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:48.660983 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-qcgmd"] Apr 25 00:01:48.664144 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:48.664118 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-qcgmd" Apr 25 00:01:48.669297 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:48.669277 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 25 00:01:48.669416 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:48.669343 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-j8qqq\"" Apr 25 00:01:48.679539 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:48.679513 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-qcgmd"] Apr 25 00:01:48.700652 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:48.700619 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/19114d5b-c74f-462f-82b2-eafccb3d8d4e-cert\") pod \"kserve-controller-manager-64c4d9588d-6r5k6\" (UID: \"19114d5b-c74f-462f-82b2-eafccb3d8d4e\") " pod="kserve/kserve-controller-manager-64c4d9588d-6r5k6" Apr 25 00:01:48.700803 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:48.700661 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bxtd\" (UniqueName: \"kubernetes.io/projected/19114d5b-c74f-462f-82b2-eafccb3d8d4e-kube-api-access-5bxtd\") pod \"kserve-controller-manager-64c4d9588d-6r5k6\" (UID: \"19114d5b-c74f-462f-82b2-eafccb3d8d4e\") " pod="kserve/kserve-controller-manager-64c4d9588d-6r5k6" Apr 25 00:01:48.700803 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:48.700754 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/599c0f54-e37c-4031-bd43-1e7c027ea21b-cert\") pod \"llmisvc-controller-manager-7689784d4c-wg8df\" (UID: \"599c0f54-e37c-4031-bd43-1e7c027ea21b\") " pod="kserve/llmisvc-controller-manager-7689784d4c-wg8df" Apr 25 00:01:48.700803 
ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:48.700780 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2krs\" (UniqueName: \"kubernetes.io/projected/adc22d8c-100e-4e57-bcad-4319af5e0d4f-kube-api-access-q2krs\") pod \"seaweedfs-86cc847c5c-qcgmd\" (UID: \"adc22d8c-100e-4e57-bcad-4319af5e0d4f\") " pod="kserve/seaweedfs-86cc847c5c-qcgmd" Apr 25 00:01:48.700955 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:48.700818 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/adc22d8c-100e-4e57-bcad-4319af5e0d4f-data\") pod \"seaweedfs-86cc847c5c-qcgmd\" (UID: \"adc22d8c-100e-4e57-bcad-4319af5e0d4f\") " pod="kserve/seaweedfs-86cc847c5c-qcgmd" Apr 25 00:01:48.700955 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:48.700870 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5g8g\" (UniqueName: \"kubernetes.io/projected/599c0f54-e37c-4031-bd43-1e7c027ea21b-kube-api-access-m5g8g\") pod \"llmisvc-controller-manager-7689784d4c-wg8df\" (UID: \"599c0f54-e37c-4031-bd43-1e7c027ea21b\") " pod="kserve/llmisvc-controller-manager-7689784d4c-wg8df" Apr 25 00:01:48.801497 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:48.801466 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/599c0f54-e37c-4031-bd43-1e7c027ea21b-cert\") pod \"llmisvc-controller-manager-7689784d4c-wg8df\" (UID: \"599c0f54-e37c-4031-bd43-1e7c027ea21b\") " pod="kserve/llmisvc-controller-manager-7689784d4c-wg8df" Apr 25 00:01:48.801497 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:48.801501 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q2krs\" (UniqueName: \"kubernetes.io/projected/adc22d8c-100e-4e57-bcad-4319af5e0d4f-kube-api-access-q2krs\") pod 
\"seaweedfs-86cc847c5c-qcgmd\" (UID: \"adc22d8c-100e-4e57-bcad-4319af5e0d4f\") " pod="kserve/seaweedfs-86cc847c5c-qcgmd" Apr 25 00:01:48.801752 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:48.801538 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/adc22d8c-100e-4e57-bcad-4319af5e0d4f-data\") pod \"seaweedfs-86cc847c5c-qcgmd\" (UID: \"adc22d8c-100e-4e57-bcad-4319af5e0d4f\") " pod="kserve/seaweedfs-86cc847c5c-qcgmd" Apr 25 00:01:48.801752 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:48.801581 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m5g8g\" (UniqueName: \"kubernetes.io/projected/599c0f54-e37c-4031-bd43-1e7c027ea21b-kube-api-access-m5g8g\") pod \"llmisvc-controller-manager-7689784d4c-wg8df\" (UID: \"599c0f54-e37c-4031-bd43-1e7c027ea21b\") " pod="kserve/llmisvc-controller-manager-7689784d4c-wg8df" Apr 25 00:01:48.801752 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:48.801618 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/19114d5b-c74f-462f-82b2-eafccb3d8d4e-cert\") pod \"kserve-controller-manager-64c4d9588d-6r5k6\" (UID: \"19114d5b-c74f-462f-82b2-eafccb3d8d4e\") " pod="kserve/kserve-controller-manager-64c4d9588d-6r5k6" Apr 25 00:01:48.801752 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:48.801658 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5bxtd\" (UniqueName: \"kubernetes.io/projected/19114d5b-c74f-462f-82b2-eafccb3d8d4e-kube-api-access-5bxtd\") pod \"kserve-controller-manager-64c4d9588d-6r5k6\" (UID: \"19114d5b-c74f-462f-82b2-eafccb3d8d4e\") " pod="kserve/kserve-controller-manager-64c4d9588d-6r5k6" Apr 25 00:01:48.801960 ip-10-0-133-214 kubenswrapper[2569]: E0425 00:01:48.801766 2569 secret.go:189] Couldn't get secret kserve/kserve-webhook-server-cert: secret 
"kserve-webhook-server-cert" not found Apr 25 00:01:48.801960 ip-10-0-133-214 kubenswrapper[2569]: E0425 00:01:48.801851 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/19114d5b-c74f-462f-82b2-eafccb3d8d4e-cert podName:19114d5b-c74f-462f-82b2-eafccb3d8d4e nodeName:}" failed. No retries permitted until 2026-04-25 00:01:49.301832059 +0000 UTC m=+487.906558652 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/19114d5b-c74f-462f-82b2-eafccb3d8d4e-cert") pod "kserve-controller-manager-64c4d9588d-6r5k6" (UID: "19114d5b-c74f-462f-82b2-eafccb3d8d4e") : secret "kserve-webhook-server-cert" not found Apr 25 00:01:48.802077 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:48.802033 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/adc22d8c-100e-4e57-bcad-4319af5e0d4f-data\") pod \"seaweedfs-86cc847c5c-qcgmd\" (UID: \"adc22d8c-100e-4e57-bcad-4319af5e0d4f\") " pod="kserve/seaweedfs-86cc847c5c-qcgmd" Apr 25 00:01:48.803961 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:48.803934 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/599c0f54-e37c-4031-bd43-1e7c027ea21b-cert\") pod \"llmisvc-controller-manager-7689784d4c-wg8df\" (UID: \"599c0f54-e37c-4031-bd43-1e7c027ea21b\") " pod="kserve/llmisvc-controller-manager-7689784d4c-wg8df" Apr 25 00:01:48.812330 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:48.812288 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5g8g\" (UniqueName: \"kubernetes.io/projected/599c0f54-e37c-4031-bd43-1e7c027ea21b-kube-api-access-m5g8g\") pod \"llmisvc-controller-manager-7689784d4c-wg8df\" (UID: \"599c0f54-e37c-4031-bd43-1e7c027ea21b\") " pod="kserve/llmisvc-controller-manager-7689784d4c-wg8df" Apr 25 00:01:48.813677 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:48.813650 
2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bxtd\" (UniqueName: \"kubernetes.io/projected/19114d5b-c74f-462f-82b2-eafccb3d8d4e-kube-api-access-5bxtd\") pod \"kserve-controller-manager-64c4d9588d-6r5k6\" (UID: \"19114d5b-c74f-462f-82b2-eafccb3d8d4e\") " pod="kserve/kserve-controller-manager-64c4d9588d-6r5k6" Apr 25 00:01:48.813931 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:48.813914 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2krs\" (UniqueName: \"kubernetes.io/projected/adc22d8c-100e-4e57-bcad-4319af5e0d4f-kube-api-access-q2krs\") pod \"seaweedfs-86cc847c5c-qcgmd\" (UID: \"adc22d8c-100e-4e57-bcad-4319af5e0d4f\") " pod="kserve/seaweedfs-86cc847c5c-qcgmd" Apr 25 00:01:48.958922 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:48.958844 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-7689784d4c-wg8df" Apr 25 00:01:48.973700 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:48.973670 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-qcgmd" Apr 25 00:01:49.105942 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:49.105907 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-7689784d4c-wg8df"] Apr 25 00:01:49.107359 ip-10-0-133-214 kubenswrapper[2569]: W0425 00:01:49.107334 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod599c0f54_e37c_4031_bd43_1e7c027ea21b.slice/crio-a451483371b9fb7bd33b76276a18de6b33847111d07d706f0ff790142f574460 WatchSource:0}: Error finding container a451483371b9fb7bd33b76276a18de6b33847111d07d706f0ff790142f574460: Status 404 returned error can't find the container with id a451483371b9fb7bd33b76276a18de6b33847111d07d706f0ff790142f574460 Apr 25 00:01:49.127372 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:49.127346 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-qcgmd"] Apr 25 00:01:49.130302 ip-10-0-133-214 kubenswrapper[2569]: W0425 00:01:49.130273 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podadc22d8c_100e_4e57_bcad_4319af5e0d4f.slice/crio-e31d139b464f1af1480862e9da8962bd5143781594d16309fffb4f0ebc86409f WatchSource:0}: Error finding container e31d139b464f1af1480862e9da8962bd5143781594d16309fffb4f0ebc86409f: Status 404 returned error can't find the container with id e31d139b464f1af1480862e9da8962bd5143781594d16309fffb4f0ebc86409f Apr 25 00:01:49.307072 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:49.307038 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/19114d5b-c74f-462f-82b2-eafccb3d8d4e-cert\") pod \"kserve-controller-manager-64c4d9588d-6r5k6\" (UID: \"19114d5b-c74f-462f-82b2-eafccb3d8d4e\") " pod="kserve/kserve-controller-manager-64c4d9588d-6r5k6" Apr 25 00:01:49.309443 ip-10-0-133-214 kubenswrapper[2569]: I0425 
00:01:49.309419 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/19114d5b-c74f-462f-82b2-eafccb3d8d4e-cert\") pod \"kserve-controller-manager-64c4d9588d-6r5k6\" (UID: \"19114d5b-c74f-462f-82b2-eafccb3d8d4e\") " pod="kserve/kserve-controller-manager-64c4d9588d-6r5k6" Apr 25 00:01:49.547488 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:49.547455 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-64c4d9588d-6r5k6" Apr 25 00:01:49.696297 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:49.696244 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-qcgmd" event={"ID":"adc22d8c-100e-4e57-bcad-4319af5e0d4f","Type":"ContainerStarted","Data":"e31d139b464f1af1480862e9da8962bd5143781594d16309fffb4f0ebc86409f"} Apr 25 00:01:49.698603 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:49.698568 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-7689784d4c-wg8df" event={"ID":"599c0f54-e37c-4031-bd43-1e7c027ea21b","Type":"ContainerStarted","Data":"a451483371b9fb7bd33b76276a18de6b33847111d07d706f0ff790142f574460"} Apr 25 00:01:49.749231 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:49.749193 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-64c4d9588d-6r5k6"] Apr 25 00:01:49.753563 ip-10-0-133-214 kubenswrapper[2569]: W0425 00:01:49.753522 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19114d5b_c74f_462f_82b2_eafccb3d8d4e.slice/crio-7673e255aecb63f2440e3ffcd6be512a45675e59e2687893640991749b4c2913 WatchSource:0}: Error finding container 7673e255aecb63f2440e3ffcd6be512a45675e59e2687893640991749b4c2913: Status 404 returned error can't find the container with id 7673e255aecb63f2440e3ffcd6be512a45675e59e2687893640991749b4c2913 Apr 25 00:01:50.708719 
ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:50.708681 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-64c4d9588d-6r5k6" event={"ID":"19114d5b-c74f-462f-82b2-eafccb3d8d4e","Type":"ContainerStarted","Data":"7673e255aecb63f2440e3ffcd6be512a45675e59e2687893640991749b4c2913"} Apr 25 00:01:54.727584 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:54.727540 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-qcgmd" event={"ID":"adc22d8c-100e-4e57-bcad-4319af5e0d4f","Type":"ContainerStarted","Data":"428fc970b7110693730ff5efd5649aadb1fe818198368ace48dad629c88a401b"} Apr 25 00:01:54.728026 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:54.727871 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-qcgmd" Apr 25 00:01:54.729296 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:54.729259 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-64c4d9588d-6r5k6" event={"ID":"19114d5b-c74f-462f-82b2-eafccb3d8d4e","Type":"ContainerStarted","Data":"a54aa35e7448d855159619a52d88ee972e465273ad9d382b2f8164d235698441"} Apr 25 00:01:54.729454 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:54.729371 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-64c4d9588d-6r5k6" Apr 25 00:01:54.730715 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:54.730692 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-7689784d4c-wg8df" event={"ID":"599c0f54-e37c-4031-bd43-1e7c027ea21b","Type":"ContainerStarted","Data":"eaa1e61cfb73aee0c0f432e1d9424e2b7cf07f7fc7270545fb107240d1f57ca9"} Apr 25 00:01:54.730863 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:54.730849 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-7689784d4c-wg8df" Apr 25 00:01:54.742934 
ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:54.742894 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-qcgmd" podStartSLOduration=1.8672653399999999 podStartE2EDuration="6.742881439s" podCreationTimestamp="2026-04-25 00:01:48 +0000 UTC" firstStartedPulling="2026-04-25 00:01:49.131604857 +0000 UTC m=+487.736331436" lastFinishedPulling="2026-04-25 00:01:54.007220959 +0000 UTC m=+492.611947535" observedRunningTime="2026-04-25 00:01:54.741851715 +0000 UTC m=+493.346578314" watchObservedRunningTime="2026-04-25 00:01:54.742881439 +0000 UTC m=+493.347608059" Apr 25 00:01:54.758519 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:54.758473 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-7689784d4c-wg8df" podStartSLOduration=1.898731818 podStartE2EDuration="6.758460534s" podCreationTimestamp="2026-04-25 00:01:48 +0000 UTC" firstStartedPulling="2026-04-25 00:01:49.108484096 +0000 UTC m=+487.713210672" lastFinishedPulling="2026-04-25 00:01:53.968212808 +0000 UTC m=+492.572939388" observedRunningTime="2026-04-25 00:01:54.755961823 +0000 UTC m=+493.360688421" watchObservedRunningTime="2026-04-25 00:01:54.758460534 +0000 UTC m=+493.363187132" Apr 25 00:01:54.771453 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:01:54.771395 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-64c4d9588d-6r5k6" podStartSLOduration=2.559453466 podStartE2EDuration="6.771383983s" podCreationTimestamp="2026-04-25 00:01:48 +0000 UTC" firstStartedPulling="2026-04-25 00:01:49.755724583 +0000 UTC m=+488.360451166" lastFinishedPulling="2026-04-25 00:01:53.967655091 +0000 UTC m=+492.572381683" observedRunningTime="2026-04-25 00:01:54.770660014 +0000 UTC m=+493.375386612" watchObservedRunningTime="2026-04-25 00:01:54.771383983 +0000 UTC m=+493.376110580" Apr 25 00:02:00.737363 ip-10-0-133-214 kubenswrapper[2569]: I0425 
00:02:00.737335 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-qcgmd" Apr 25 00:02:25.736788 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:02:25.736756 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-7689784d4c-wg8df" Apr 25 00:02:25.739929 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:02:25.739910 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-64c4d9588d-6r5k6" Apr 25 00:02:27.058105 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:02:27.058072 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-64c4d9588d-6r5k6"] Apr 25 00:02:27.058563 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:02:27.058270 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-64c4d9588d-6r5k6" podUID="19114d5b-c74f-462f-82b2-eafccb3d8d4e" containerName="manager" containerID="cri-o://a54aa35e7448d855159619a52d88ee972e465273ad9d382b2f8164d235698441" gracePeriod=10 Apr 25 00:02:27.083350 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:02:27.083320 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-64c4d9588d-pj2cx"] Apr 25 00:02:27.134665 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:02:27.134636 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-64c4d9588d-pj2cx"] Apr 25 00:02:27.134807 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:02:27.134758 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-64c4d9588d-pj2cx" Apr 25 00:02:27.234038 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:02:27.234005 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrdmm\" (UniqueName: \"kubernetes.io/projected/240d3421-bb02-433c-8bfc-50a8cc3a6eff-kube-api-access-rrdmm\") pod \"kserve-controller-manager-64c4d9588d-pj2cx\" (UID: \"240d3421-bb02-433c-8bfc-50a8cc3a6eff\") " pod="kserve/kserve-controller-manager-64c4d9588d-pj2cx" Apr 25 00:02:27.234201 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:02:27.234130 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/240d3421-bb02-433c-8bfc-50a8cc3a6eff-cert\") pod \"kserve-controller-manager-64c4d9588d-pj2cx\" (UID: \"240d3421-bb02-433c-8bfc-50a8cc3a6eff\") " pod="kserve/kserve-controller-manager-64c4d9588d-pj2cx" Apr 25 00:02:27.301461 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:02:27.301438 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-64c4d9588d-6r5k6" Apr 25 00:02:27.334892 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:02:27.334823 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/240d3421-bb02-433c-8bfc-50a8cc3a6eff-cert\") pod \"kserve-controller-manager-64c4d9588d-pj2cx\" (UID: \"240d3421-bb02-433c-8bfc-50a8cc3a6eff\") " pod="kserve/kserve-controller-manager-64c4d9588d-pj2cx" Apr 25 00:02:27.334892 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:02:27.334865 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rrdmm\" (UniqueName: \"kubernetes.io/projected/240d3421-bb02-433c-8bfc-50a8cc3a6eff-kube-api-access-rrdmm\") pod \"kserve-controller-manager-64c4d9588d-pj2cx\" (UID: \"240d3421-bb02-433c-8bfc-50a8cc3a6eff\") " pod="kserve/kserve-controller-manager-64c4d9588d-pj2cx" Apr 25 00:02:27.337445 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:02:27.337416 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/240d3421-bb02-433c-8bfc-50a8cc3a6eff-cert\") pod \"kserve-controller-manager-64c4d9588d-pj2cx\" (UID: \"240d3421-bb02-433c-8bfc-50a8cc3a6eff\") " pod="kserve/kserve-controller-manager-64c4d9588d-pj2cx" Apr 25 00:02:27.342844 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:02:27.342791 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrdmm\" (UniqueName: \"kubernetes.io/projected/240d3421-bb02-433c-8bfc-50a8cc3a6eff-kube-api-access-rrdmm\") pod \"kserve-controller-manager-64c4d9588d-pj2cx\" (UID: \"240d3421-bb02-433c-8bfc-50a8cc3a6eff\") " pod="kserve/kserve-controller-manager-64c4d9588d-pj2cx" Apr 25 00:02:27.435610 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:02:27.435567 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/19114d5b-c74f-462f-82b2-eafccb3d8d4e-cert\") pod \"19114d5b-c74f-462f-82b2-eafccb3d8d4e\" (UID: \"19114d5b-c74f-462f-82b2-eafccb3d8d4e\") "
Apr 25 00:02:27.435796 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:02:27.435668 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5bxtd\" (UniqueName: \"kubernetes.io/projected/19114d5b-c74f-462f-82b2-eafccb3d8d4e-kube-api-access-5bxtd\") pod \"19114d5b-c74f-462f-82b2-eafccb3d8d4e\" (UID: \"19114d5b-c74f-462f-82b2-eafccb3d8d4e\") "
Apr 25 00:02:27.437787 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:02:27.437759 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19114d5b-c74f-462f-82b2-eafccb3d8d4e-cert" (OuterVolumeSpecName: "cert") pod "19114d5b-c74f-462f-82b2-eafccb3d8d4e" (UID: "19114d5b-c74f-462f-82b2-eafccb3d8d4e"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 25 00:02:27.437890 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:02:27.437782 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19114d5b-c74f-462f-82b2-eafccb3d8d4e-kube-api-access-5bxtd" (OuterVolumeSpecName: "kube-api-access-5bxtd") pod "19114d5b-c74f-462f-82b2-eafccb3d8d4e" (UID: "19114d5b-c74f-462f-82b2-eafccb3d8d4e"). InnerVolumeSpecName "kube-api-access-5bxtd". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 25 00:02:27.506271 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:02:27.506239 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-64c4d9588d-pj2cx"
Apr 25 00:02:27.536169 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:02:27.536137 2569 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/19114d5b-c74f-462f-82b2-eafccb3d8d4e-cert\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\""
Apr 25 00:02:27.536169 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:02:27.536167 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5bxtd\" (UniqueName: \"kubernetes.io/projected/19114d5b-c74f-462f-82b2-eafccb3d8d4e-kube-api-access-5bxtd\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\""
Apr 25 00:02:27.626892 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:02:27.626861 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-64c4d9588d-pj2cx"]
Apr 25 00:02:27.628730 ip-10-0-133-214 kubenswrapper[2569]: W0425 00:02:27.628703 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod240d3421_bb02_433c_8bfc_50a8cc3a6eff.slice/crio-5aeca2ab0cf8d78f6bf3b04831420a63ea6b1c1577a6146741efe94bf2186965 WatchSource:0}: Error finding container 5aeca2ab0cf8d78f6bf3b04831420a63ea6b1c1577a6146741efe94bf2186965: Status 404 returned error can't find the container with id 5aeca2ab0cf8d78f6bf3b04831420a63ea6b1c1577a6146741efe94bf2186965
Apr 25 00:02:27.850572 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:02:27.850535 2569 generic.go:358] "Generic (PLEG): container finished" podID="19114d5b-c74f-462f-82b2-eafccb3d8d4e" containerID="a54aa35e7448d855159619a52d88ee972e465273ad9d382b2f8164d235698441" exitCode=0
Apr 25 00:02:27.850757 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:02:27.850590 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-64c4d9588d-6r5k6"
Apr 25 00:02:27.850757 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:02:27.850598 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-64c4d9588d-6r5k6" event={"ID":"19114d5b-c74f-462f-82b2-eafccb3d8d4e","Type":"ContainerDied","Data":"a54aa35e7448d855159619a52d88ee972e465273ad9d382b2f8164d235698441"}
Apr 25 00:02:27.850757 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:02:27.850643 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-64c4d9588d-6r5k6" event={"ID":"19114d5b-c74f-462f-82b2-eafccb3d8d4e","Type":"ContainerDied","Data":"7673e255aecb63f2440e3ffcd6be512a45675e59e2687893640991749b4c2913"}
Apr 25 00:02:27.850757 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:02:27.850667 2569 scope.go:117] "RemoveContainer" containerID="a54aa35e7448d855159619a52d88ee972e465273ad9d382b2f8164d235698441"
Apr 25 00:02:27.851837 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:02:27.851819 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-64c4d9588d-pj2cx" event={"ID":"240d3421-bb02-433c-8bfc-50a8cc3a6eff","Type":"ContainerStarted","Data":"5aeca2ab0cf8d78f6bf3b04831420a63ea6b1c1577a6146741efe94bf2186965"}
Apr 25 00:02:27.859256 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:02:27.859230 2569 scope.go:117] "RemoveContainer" containerID="a54aa35e7448d855159619a52d88ee972e465273ad9d382b2f8164d235698441"
Apr 25 00:02:27.859608 ip-10-0-133-214 kubenswrapper[2569]: E0425 00:02:27.859585 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a54aa35e7448d855159619a52d88ee972e465273ad9d382b2f8164d235698441\": container with ID starting with a54aa35e7448d855159619a52d88ee972e465273ad9d382b2f8164d235698441 not found: ID does not exist" containerID="a54aa35e7448d855159619a52d88ee972e465273ad9d382b2f8164d235698441"
Apr 25 00:02:27.859671 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:02:27.859619 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a54aa35e7448d855159619a52d88ee972e465273ad9d382b2f8164d235698441"} err="failed to get container status \"a54aa35e7448d855159619a52d88ee972e465273ad9d382b2f8164d235698441\": rpc error: code = NotFound desc = could not find container \"a54aa35e7448d855159619a52d88ee972e465273ad9d382b2f8164d235698441\": container with ID starting with a54aa35e7448d855159619a52d88ee972e465273ad9d382b2f8164d235698441 not found: ID does not exist"
Apr 25 00:02:27.872482 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:02:27.872460 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-64c4d9588d-6r5k6"]
Apr 25 00:02:27.876201 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:02:27.876150 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-64c4d9588d-6r5k6"]
Apr 25 00:02:27.977603 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:02:27.977573 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19114d5b-c74f-462f-82b2-eafccb3d8d4e" path="/var/lib/kubelet/pods/19114d5b-c74f-462f-82b2-eafccb3d8d4e/volumes"
Apr 25 00:02:28.857579 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:02:28.857543 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-64c4d9588d-pj2cx" event={"ID":"240d3421-bb02-433c-8bfc-50a8cc3a6eff","Type":"ContainerStarted","Data":"8a9a84a2e320958e8b8bde6a1e0510437e110a79982959acde519a5f766d4022"}
Apr 25 00:02:28.857962 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:02:28.857638 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-64c4d9588d-pj2cx"
Apr 25 00:02:28.874452 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:02:28.874386 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-64c4d9588d-pj2cx" podStartSLOduration=1.474608635 podStartE2EDuration="1.874374276s" podCreationTimestamp="2026-04-25 00:02:27 +0000 UTC" firstStartedPulling="2026-04-25 00:02:27.63001647 +0000 UTC m=+526.234743047" lastFinishedPulling="2026-04-25 00:02:28.029782098 +0000 UTC m=+526.634508688" observedRunningTime="2026-04-25 00:02:28.87308556 +0000 UTC m=+527.477812158" watchObservedRunningTime="2026-04-25 00:02:28.874374276 +0000 UTC m=+527.479100873"
Apr 25 00:02:59.865809 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:02:59.865772 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-64c4d9588d-pj2cx"
Apr 25 00:03:00.706782 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:03:00.706745 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-5c775"]
Apr 25 00:03:00.707330 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:03:00.707305 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="19114d5b-c74f-462f-82b2-eafccb3d8d4e" containerName="manager"
Apr 25 00:03:00.707330 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:03:00.707329 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="19114d5b-c74f-462f-82b2-eafccb3d8d4e" containerName="manager"
Apr 25 00:03:00.707513 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:03:00.707426 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="19114d5b-c74f-462f-82b2-eafccb3d8d4e" containerName="manager"
Apr 25 00:03:00.710692 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:03:00.710671 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-5c775"
Apr 25 00:03:00.713482 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:03:00.713457 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\""
Apr 25 00:03:00.713593 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:03:00.713462 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-d2vxp\""
Apr 25 00:03:00.720879 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:03:00.719502 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-5c775"]
Apr 25 00:03:00.722988 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:03:00.722963 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-6gqhc"]
Apr 25 00:03:00.726168 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:03:00.726148 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-6gqhc"
Apr 25 00:03:00.728478 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:03:00.728458 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\""
Apr 25 00:03:00.728764 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:03:00.728576 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-hzqzw\""
Apr 25 00:03:00.732986 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:03:00.732941 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-6gqhc"]
Apr 25 00:03:00.801080 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:03:00.801049 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3f125a41-7d58-42c0-86d8-9e89c5c6d9fd-cert\") pod \"odh-model-controller-696fc77849-6gqhc\" (UID: \"3f125a41-7d58-42c0-86d8-9e89c5c6d9fd\") " pod="kserve/odh-model-controller-696fc77849-6gqhc"
Apr 25 00:03:00.801237 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:03:00.801096 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a65f09a5-572e-4db9-87ca-a10c2ae08d7e-tls-certs\") pod \"model-serving-api-86f7b4b499-5c775\" (UID: \"a65f09a5-572e-4db9-87ca-a10c2ae08d7e\") " pod="kserve/model-serving-api-86f7b4b499-5c775"
Apr 25 00:03:00.801280 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:03:00.801237 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlsjn\" (UniqueName: \"kubernetes.io/projected/a65f09a5-572e-4db9-87ca-a10c2ae08d7e-kube-api-access-hlsjn\") pod \"model-serving-api-86f7b4b499-5c775\" (UID: \"a65f09a5-572e-4db9-87ca-a10c2ae08d7e\") " pod="kserve/model-serving-api-86f7b4b499-5c775"
Apr 25 00:03:00.801316 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:03:00.801306 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vj56g\" (UniqueName: \"kubernetes.io/projected/3f125a41-7d58-42c0-86d8-9e89c5c6d9fd-kube-api-access-vj56g\") pod \"odh-model-controller-696fc77849-6gqhc\" (UID: \"3f125a41-7d58-42c0-86d8-9e89c5c6d9fd\") " pod="kserve/odh-model-controller-696fc77849-6gqhc"
Apr 25 00:03:00.902100 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:03:00.902066 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a65f09a5-572e-4db9-87ca-a10c2ae08d7e-tls-certs\") pod \"model-serving-api-86f7b4b499-5c775\" (UID: \"a65f09a5-572e-4db9-87ca-a10c2ae08d7e\") " pod="kserve/model-serving-api-86f7b4b499-5c775"
Apr 25 00:03:00.902592 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:03:00.902128 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hlsjn\" (UniqueName: \"kubernetes.io/projected/a65f09a5-572e-4db9-87ca-a10c2ae08d7e-kube-api-access-hlsjn\") pod \"model-serving-api-86f7b4b499-5c775\" (UID: \"a65f09a5-572e-4db9-87ca-a10c2ae08d7e\") " pod="kserve/model-serving-api-86f7b4b499-5c775"
Apr 25 00:03:00.902592 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:03:00.902155 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vj56g\" (UniqueName: \"kubernetes.io/projected/3f125a41-7d58-42c0-86d8-9e89c5c6d9fd-kube-api-access-vj56g\") pod \"odh-model-controller-696fc77849-6gqhc\" (UID: \"3f125a41-7d58-42c0-86d8-9e89c5c6d9fd\") " pod="kserve/odh-model-controller-696fc77849-6gqhc"
Apr 25 00:03:00.902592 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:03:00.902184 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3f125a41-7d58-42c0-86d8-9e89c5c6d9fd-cert\") pod \"odh-model-controller-696fc77849-6gqhc\" (UID: \"3f125a41-7d58-42c0-86d8-9e89c5c6d9fd\") " pod="kserve/odh-model-controller-696fc77849-6gqhc"
Apr 25 00:03:00.904578 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:03:00.904553 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a65f09a5-572e-4db9-87ca-a10c2ae08d7e-tls-certs\") pod \"model-serving-api-86f7b4b499-5c775\" (UID: \"a65f09a5-572e-4db9-87ca-a10c2ae08d7e\") " pod="kserve/model-serving-api-86f7b4b499-5c775"
Apr 25 00:03:00.904578 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:03:00.904568 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3f125a41-7d58-42c0-86d8-9e89c5c6d9fd-cert\") pod \"odh-model-controller-696fc77849-6gqhc\" (UID: \"3f125a41-7d58-42c0-86d8-9e89c5c6d9fd\") " pod="kserve/odh-model-controller-696fc77849-6gqhc"
Apr 25 00:03:00.910222 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:03:00.910190 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlsjn\" (UniqueName: \"kubernetes.io/projected/a65f09a5-572e-4db9-87ca-a10c2ae08d7e-kube-api-access-hlsjn\") pod \"model-serving-api-86f7b4b499-5c775\" (UID: \"a65f09a5-572e-4db9-87ca-a10c2ae08d7e\") " pod="kserve/model-serving-api-86f7b4b499-5c775"
Apr 25 00:03:00.910321 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:03:00.910244 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vj56g\" (UniqueName: \"kubernetes.io/projected/3f125a41-7d58-42c0-86d8-9e89c5c6d9fd-kube-api-access-vj56g\") pod \"odh-model-controller-696fc77849-6gqhc\" (UID: \"3f125a41-7d58-42c0-86d8-9e89c5c6d9fd\") " pod="kserve/odh-model-controller-696fc77849-6gqhc"
Apr 25 00:03:01.025038 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:03:01.025010 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-5c775"
Apr 25 00:03:01.039705 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:03:01.039682 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-6gqhc"
Apr 25 00:03:01.154050 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:03:01.154022 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-5c775"]
Apr 25 00:03:01.156484 ip-10-0-133-214 kubenswrapper[2569]: W0425 00:03:01.156455 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda65f09a5_572e_4db9_87ca_a10c2ae08d7e.slice/crio-fa60794d972f8d164e4a29ca1bf934f78899f84d46e44d285274e87feb9be8f5 WatchSource:0}: Error finding container fa60794d972f8d164e4a29ca1bf934f78899f84d46e44d285274e87feb9be8f5: Status 404 returned error can't find the container with id fa60794d972f8d164e4a29ca1bf934f78899f84d46e44d285274e87feb9be8f5
Apr 25 00:03:01.177525 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:03:01.177500 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-6gqhc"]
Apr 25 00:03:01.178735 ip-10-0-133-214 kubenswrapper[2569]: W0425 00:03:01.178707 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f125a41_7d58_42c0_86d8_9e89c5c6d9fd.slice/crio-032de148f76eb7f106e96deba4bd329f2a55be7160724dfce81803c4d79c2d2b WatchSource:0}: Error finding container 032de148f76eb7f106e96deba4bd329f2a55be7160724dfce81803c4d79c2d2b: Status 404 returned error can't find the container with id 032de148f76eb7f106e96deba4bd329f2a55be7160724dfce81803c4d79c2d2b
Apr 25 00:03:01.978352 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:03:01.978304 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-6gqhc" event={"ID":"3f125a41-7d58-42c0-86d8-9e89c5c6d9fd","Type":"ContainerStarted","Data":"032de148f76eb7f106e96deba4bd329f2a55be7160724dfce81803c4d79c2d2b"}
Apr 25 00:03:01.980189 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:03:01.980125 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-5c775" event={"ID":"a65f09a5-572e-4db9-87ca-a10c2ae08d7e","Type":"ContainerStarted","Data":"fa60794d972f8d164e4a29ca1bf934f78899f84d46e44d285274e87feb9be8f5"}
Apr 25 00:03:03.988865 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:03:03.988770 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-6gqhc" event={"ID":"3f125a41-7d58-42c0-86d8-9e89c5c6d9fd","Type":"ContainerStarted","Data":"bb3608a797d33a073fd3d3ad8a150325ebbedadb25dd9987e3f22fbd53ec6725"}
Apr 25 00:03:03.988865 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:03:03.988836 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-6gqhc"
Apr 25 00:03:03.990066 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:03:03.990038 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-5c775" event={"ID":"a65f09a5-572e-4db9-87ca-a10c2ae08d7e","Type":"ContainerStarted","Data":"bcf0cf9cd1c3f5fec7b58092b9b4bf7e7c38f3e89ccb94afd079ee669b39988f"}
Apr 25 00:03:03.990158 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:03:03.990143 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-5c775"
Apr 25 00:03:04.005818 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:03:04.005759 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-6gqhc" podStartSLOduration=1.5507628580000001 podStartE2EDuration="4.005745341s" podCreationTimestamp="2026-04-25 00:03:00 +0000 UTC" firstStartedPulling="2026-04-25 00:03:01.17992529 +0000 UTC m=+559.784651865" lastFinishedPulling="2026-04-25 00:03:03.634907773 +0000 UTC m=+562.239634348" observedRunningTime="2026-04-25 00:03:04.003431504 +0000 UTC m=+562.608158103" watchObservedRunningTime="2026-04-25 00:03:04.005745341 +0000 UTC m=+562.610471939"
Apr 25 00:03:04.022748 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:03:04.022679 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-5c775" podStartSLOduration=1.584871284 podStartE2EDuration="4.022663477s" podCreationTimestamp="2026-04-25 00:03:00 +0000 UTC" firstStartedPulling="2026-04-25 00:03:01.158442773 +0000 UTC m=+559.763169355" lastFinishedPulling="2026-04-25 00:03:03.596234961 +0000 UTC m=+562.200961548" observedRunningTime="2026-04-25 00:03:04.019325154 +0000 UTC m=+562.624051753" watchObservedRunningTime="2026-04-25 00:03:04.022663477 +0000 UTC m=+562.627390074"
Apr 25 00:03:14.996282 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:03:14.996253 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-6gqhc"
Apr 25 00:03:14.997869 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:03:14.997849 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-5c775"
Apr 25 00:03:45.859470 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:03:45.859429 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-58fc84fd-f6zp4"]
Apr 25 00:03:45.867638 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:03:45.867612 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-58fc84fd-f6zp4"
Apr 25 00:03:45.870352 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:03:45.870328 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\""
Apr 25 00:03:45.870493 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:03:45.870472 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-4dwzn\""
Apr 25 00:03:45.871381 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:03:45.871239 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-inline-config-test-kserve-self-signed-certs\""
Apr 25 00:03:45.871650 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:03:45.871621 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 25 00:03:45.874906 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:03:45.874876 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9d4cf45f-202c-4246-845a-e67ddc65c337-model-cache\") pod \"scheduler-inline-config-test-kserve-58fc84fd-f6zp4\" (UID: \"9d4cf45f-202c-4246-845a-e67ddc65c337\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-58fc84fd-f6zp4"
Apr 25 00:03:45.875055 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:03:45.874937 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9d4cf45f-202c-4246-845a-e67ddc65c337-tls-certs\") pod \"scheduler-inline-config-test-kserve-58fc84fd-f6zp4\" (UID: \"9d4cf45f-202c-4246-845a-e67ddc65c337\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-58fc84fd-f6zp4"
Apr 25 00:03:45.875055 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:03:45.874965 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tw9p\" (UniqueName: \"kubernetes.io/projected/9d4cf45f-202c-4246-845a-e67ddc65c337-kube-api-access-2tw9p\") pod \"scheduler-inline-config-test-kserve-58fc84fd-f6zp4\" (UID: \"9d4cf45f-202c-4246-845a-e67ddc65c337\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-58fc84fd-f6zp4"
Apr 25 00:03:45.875055 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:03:45.875043 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9d4cf45f-202c-4246-845a-e67ddc65c337-dshm\") pod \"scheduler-inline-config-test-kserve-58fc84fd-f6zp4\" (UID: \"9d4cf45f-202c-4246-845a-e67ddc65c337\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-58fc84fd-f6zp4"
Apr 25 00:03:45.875572 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:03:45.875070 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9d4cf45f-202c-4246-845a-e67ddc65c337-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-58fc84fd-f6zp4\" (UID: \"9d4cf45f-202c-4246-845a-e67ddc65c337\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-58fc84fd-f6zp4"
Apr 25 00:03:45.875572 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:03:45.875112 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9d4cf45f-202c-4246-845a-e67ddc65c337-tmp-dir\") pod \"scheduler-inline-config-test-kserve-58fc84fd-f6zp4\" (UID: \"9d4cf45f-202c-4246-845a-e67ddc65c337\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-58fc84fd-f6zp4"
Apr 25 00:03:45.875572 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:03:45.875138 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9d4cf45f-202c-4246-845a-e67ddc65c337-home\") pod \"scheduler-inline-config-test-kserve-58fc84fd-f6zp4\" (UID: \"9d4cf45f-202c-4246-845a-e67ddc65c337\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-58fc84fd-f6zp4"
Apr 25 00:03:45.883543 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:03:45.883519 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-58fc84fd-f6zp4"]
Apr 25 00:03:45.975659 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:03:45.975631 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9d4cf45f-202c-4246-845a-e67ddc65c337-tls-certs\") pod \"scheduler-inline-config-test-kserve-58fc84fd-f6zp4\" (UID: \"9d4cf45f-202c-4246-845a-e67ddc65c337\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-58fc84fd-f6zp4"
Apr 25 00:03:45.975851 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:03:45.975666 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2tw9p\" (UniqueName: \"kubernetes.io/projected/9d4cf45f-202c-4246-845a-e67ddc65c337-kube-api-access-2tw9p\") pod \"scheduler-inline-config-test-kserve-58fc84fd-f6zp4\" (UID: \"9d4cf45f-202c-4246-845a-e67ddc65c337\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-58fc84fd-f6zp4"
Apr 25 00:03:45.975851 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:03:45.975718 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9d4cf45f-202c-4246-845a-e67ddc65c337-dshm\") pod \"scheduler-inline-config-test-kserve-58fc84fd-f6zp4\" (UID: \"9d4cf45f-202c-4246-845a-e67ddc65c337\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-58fc84fd-f6zp4"
Apr 25 00:03:45.975851 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:03:45.975744 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9d4cf45f-202c-4246-845a-e67ddc65c337-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-58fc84fd-f6zp4\" (UID: \"9d4cf45f-202c-4246-845a-e67ddc65c337\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-58fc84fd-f6zp4"
Apr 25 00:03:45.976263 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:03:45.976208 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9d4cf45f-202c-4246-845a-e67ddc65c337-tmp-dir\") pod \"scheduler-inline-config-test-kserve-58fc84fd-f6zp4\" (UID: \"9d4cf45f-202c-4246-845a-e67ddc65c337\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-58fc84fd-f6zp4"
Apr 25 00:03:45.976369 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:03:45.976352 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9d4cf45f-202c-4246-845a-e67ddc65c337-home\") pod \"scheduler-inline-config-test-kserve-58fc84fd-f6zp4\" (UID: \"9d4cf45f-202c-4246-845a-e67ddc65c337\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-58fc84fd-f6zp4"
Apr 25 00:03:45.976963 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:03:45.976512 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9d4cf45f-202c-4246-845a-e67ddc65c337-model-cache\") pod \"scheduler-inline-config-test-kserve-58fc84fd-f6zp4\" (UID: \"9d4cf45f-202c-4246-845a-e67ddc65c337\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-58fc84fd-f6zp4"
Apr 25 00:03:45.976963 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:03:45.976621 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9d4cf45f-202c-4246-845a-e67ddc65c337-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-58fc84fd-f6zp4\" (UID: \"9d4cf45f-202c-4246-845a-e67ddc65c337\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-58fc84fd-f6zp4"
Apr 25 00:03:45.976963 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:03:45.976785 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9d4cf45f-202c-4246-845a-e67ddc65c337-model-cache\") pod \"scheduler-inline-config-test-kserve-58fc84fd-f6zp4\" (UID: \"9d4cf45f-202c-4246-845a-e67ddc65c337\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-58fc84fd-f6zp4"
Apr 25 00:03:45.977206 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:03:45.977181 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9d4cf45f-202c-4246-845a-e67ddc65c337-home\") pod \"scheduler-inline-config-test-kserve-58fc84fd-f6zp4\" (UID: \"9d4cf45f-202c-4246-845a-e67ddc65c337\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-58fc84fd-f6zp4"
Apr 25 00:03:45.977636 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:03:45.977608 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9d4cf45f-202c-4246-845a-e67ddc65c337-tmp-dir\") pod \"scheduler-inline-config-test-kserve-58fc84fd-f6zp4\" (UID: \"9d4cf45f-202c-4246-845a-e67ddc65c337\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-58fc84fd-f6zp4"
Apr 25 00:03:45.983556 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:03:45.979684 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9d4cf45f-202c-4246-845a-e67ddc65c337-tls-certs\") pod \"scheduler-inline-config-test-kserve-58fc84fd-f6zp4\" (UID: \"9d4cf45f-202c-4246-845a-e67ddc65c337\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-58fc84fd-f6zp4"
Apr 25 00:03:45.984339 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:03:45.984317 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9d4cf45f-202c-4246-845a-e67ddc65c337-dshm\") pod \"scheduler-inline-config-test-kserve-58fc84fd-f6zp4\" (UID: \"9d4cf45f-202c-4246-845a-e67ddc65c337\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-58fc84fd-f6zp4"
Apr 25 00:03:45.984871 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:03:45.984853 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tw9p\" (UniqueName: \"kubernetes.io/projected/9d4cf45f-202c-4246-845a-e67ddc65c337-kube-api-access-2tw9p\") pod \"scheduler-inline-config-test-kserve-58fc84fd-f6zp4\" (UID: \"9d4cf45f-202c-4246-845a-e67ddc65c337\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-58fc84fd-f6zp4"
Apr 25 00:03:46.180014 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:03:46.179935 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-58fc84fd-f6zp4"
Apr 25 00:03:46.520866 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:03:46.520840 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-58fc84fd-f6zp4"]
Apr 25 00:03:46.522670 ip-10-0-133-214 kubenswrapper[2569]: W0425 00:03:46.522634 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d4cf45f_202c_4246_845a_e67ddc65c337.slice/crio-a6ae2b311a4024e0b2d3e8b809453c999ff26c5ace6b2ef3379da94822fef8b1 WatchSource:0}: Error finding container a6ae2b311a4024e0b2d3e8b809453c999ff26c5ace6b2ef3379da94822fef8b1: Status 404 returned error can't find the container with id a6ae2b311a4024e0b2d3e8b809453c999ff26c5ace6b2ef3379da94822fef8b1
Apr 25 00:03:47.154611 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:03:47.154573 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-58fc84fd-f6zp4" event={"ID":"9d4cf45f-202c-4246-845a-e67ddc65c337","Type":"ContainerStarted","Data":"a6ae2b311a4024e0b2d3e8b809453c999ff26c5ace6b2ef3379da94822fef8b1"}
Apr 25 00:03:50.168151 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:03:50.168112 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-58fc84fd-f6zp4" event={"ID":"9d4cf45f-202c-4246-845a-e67ddc65c337","Type":"ContainerStarted","Data":"41f5eb7252908a8ee84f7ba47c3268e4c40e4c2fd7c8fa34210cc76e428e7545"}
Apr 25 00:03:55.188374 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:03:55.188336 2569 generic.go:358] "Generic (PLEG): container finished" podID="9d4cf45f-202c-4246-845a-e67ddc65c337" containerID="41f5eb7252908a8ee84f7ba47c3268e4c40e4c2fd7c8fa34210cc76e428e7545" exitCode=0
Apr 25 00:03:55.188768 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:03:55.188442 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-58fc84fd-f6zp4" event={"ID":"9d4cf45f-202c-4246-845a-e67ddc65c337","Type":"ContainerDied","Data":"41f5eb7252908a8ee84f7ba47c3268e4c40e4c2fd7c8fa34210cc76e428e7545"}
Apr 25 00:03:55.189699 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:03:55.189682 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 25 00:03:57.197075 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:03:57.197035 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-58fc84fd-f6zp4" event={"ID":"9d4cf45f-202c-4246-845a-e67ddc65c337","Type":"ContainerStarted","Data":"1745260f2f81fa65346fcb5fe307cdcd43d7d60e3fef744dc93ba23698f79e16"}
Apr 25 00:03:57.214644 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:03:57.214593 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-58fc84fd-f6zp4" podStartSLOduration=2.503291174 podStartE2EDuration="12.214578022s" podCreationTimestamp="2026-04-25 00:03:45 +0000 UTC" firstStartedPulling="2026-04-25 00:03:46.525195471 +0000 UTC m=+605.129922048" lastFinishedPulling="2026-04-25 00:03:56.236482306 +0000 UTC m=+614.841208896" observedRunningTime="2026-04-25 00:03:57.213918936 +0000 UTC m=+615.818645538" watchObservedRunningTime="2026-04-25 00:03:57.214578022 +0000 UTC m=+615.819304619"
Apr 25 00:04:06.180287 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:04:06.180205 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-58fc84fd-f6zp4"
Apr 25 00:04:06.180287 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:04:06.180250 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-58fc84fd-f6zp4"
Apr 25 00:04:06.192678 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:04:06.192653 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-58fc84fd-f6zp4"
Apr 25 00:04:06.242524 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:04:06.242497 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-58fc84fd-f6zp4"
Apr 25 00:05:01.677907 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:05:01.677871 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-58fc84fd-f6zp4"]
Apr 25 00:05:01.678715 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:05:01.678259 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-58fc84fd-f6zp4" podUID="9d4cf45f-202c-4246-845a-e67ddc65c337" containerName="main" containerID="cri-o://1745260f2f81fa65346fcb5fe307cdcd43d7d60e3fef744dc93ba23698f79e16" gracePeriod=30
Apr 25 00:05:01.919554 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:05:01.919530 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-58fc84fd-f6zp4"
Apr 25 00:05:02.020424 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:05:02.020374 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9d4cf45f-202c-4246-845a-e67ddc65c337-dshm\") pod \"9d4cf45f-202c-4246-845a-e67ddc65c337\" (UID: \"9d4cf45f-202c-4246-845a-e67ddc65c337\") "
Apr 25 00:05:02.020598 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:05:02.020430 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9d4cf45f-202c-4246-845a-e67ddc65c337-home\") pod \"9d4cf45f-202c-4246-845a-e67ddc65c337\" (UID: \"9d4cf45f-202c-4246-845a-e67ddc65c337\") "
Apr 25 00:05:02.020598 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:05:02.020461 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2tw9p\" (UniqueName: \"kubernetes.io/projected/9d4cf45f-202c-4246-845a-e67ddc65c337-kube-api-access-2tw9p\") pod \"9d4cf45f-202c-4246-845a-e67ddc65c337\" (UID: \"9d4cf45f-202c-4246-845a-e67ddc65c337\") "
Apr 25 00:05:02.020598 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:05:02.020490 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9d4cf45f-202c-4246-845a-e67ddc65c337-model-cache\") pod \"9d4cf45f-202c-4246-845a-e67ddc65c337\" (UID: \"9d4cf45f-202c-4246-845a-e67ddc65c337\") "
Apr 25 00:05:02.020598 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:05:02.020506 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9d4cf45f-202c-4246-845a-e67ddc65c337-tls-certs\") pod \"9d4cf45f-202c-4246-845a-e67ddc65c337\" (UID: \"9d4cf45f-202c-4246-845a-e67ddc65c337\") "
Apr 25 00:05:02.020598 ip-10-0-133-214 kubenswrapper[2569]: I0425
00:05:02.020538 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9d4cf45f-202c-4246-845a-e67ddc65c337-tmp-dir\") pod \"9d4cf45f-202c-4246-845a-e67ddc65c337\" (UID: \"9d4cf45f-202c-4246-845a-e67ddc65c337\") " Apr 25 00:05:02.020598 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:05:02.020572 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9d4cf45f-202c-4246-845a-e67ddc65c337-kserve-provision-location\") pod \"9d4cf45f-202c-4246-845a-e67ddc65c337\" (UID: \"9d4cf45f-202c-4246-845a-e67ddc65c337\") " Apr 25 00:05:02.020899 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:05:02.020801 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d4cf45f-202c-4246-845a-e67ddc65c337-model-cache" (OuterVolumeSpecName: "model-cache") pod "9d4cf45f-202c-4246-845a-e67ddc65c337" (UID: "9d4cf45f-202c-4246-845a-e67ddc65c337"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 25 00:05:02.020899 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:05:02.020812 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d4cf45f-202c-4246-845a-e67ddc65c337-home" (OuterVolumeSpecName: "home") pod "9d4cf45f-202c-4246-845a-e67ddc65c337" (UID: "9d4cf45f-202c-4246-845a-e67ddc65c337"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 25 00:05:02.021023 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:05:02.020890 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d4cf45f-202c-4246-845a-e67ddc65c337-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "9d4cf45f-202c-4246-845a-e67ddc65c337" (UID: "9d4cf45f-202c-4246-845a-e67ddc65c337"). InnerVolumeSpecName "tmp-dir". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 25 00:05:02.022731 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:05:02.022690 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4cf45f-202c-4246-845a-e67ddc65c337-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "9d4cf45f-202c-4246-845a-e67ddc65c337" (UID: "9d4cf45f-202c-4246-845a-e67ddc65c337"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 25 00:05:02.022858 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:05:02.022777 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d4cf45f-202c-4246-845a-e67ddc65c337-dshm" (OuterVolumeSpecName: "dshm") pod "9d4cf45f-202c-4246-845a-e67ddc65c337" (UID: "9d4cf45f-202c-4246-845a-e67ddc65c337"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 25 00:05:02.022858 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:05:02.022814 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4cf45f-202c-4246-845a-e67ddc65c337-kube-api-access-2tw9p" (OuterVolumeSpecName: "kube-api-access-2tw9p") pod "9d4cf45f-202c-4246-845a-e67ddc65c337" (UID: "9d4cf45f-202c-4246-845a-e67ddc65c337"). InnerVolumeSpecName "kube-api-access-2tw9p". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 25 00:05:02.082283 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:05:02.082239 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d4cf45f-202c-4246-845a-e67ddc65c337-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "9d4cf45f-202c-4246-845a-e67ddc65c337" (UID: "9d4cf45f-202c-4246-845a-e67ddc65c337"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 25 00:05:02.122080 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:05:02.122045 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9d4cf45f-202c-4246-845a-e67ddc65c337-kserve-provision-location\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\"" Apr 25 00:05:02.122080 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:05:02.122080 2569 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9d4cf45f-202c-4246-845a-e67ddc65c337-dshm\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\"" Apr 25 00:05:02.122264 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:05:02.122096 2569 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9d4cf45f-202c-4246-845a-e67ddc65c337-home\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\"" Apr 25 00:05:02.122264 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:05:02.122111 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2tw9p\" (UniqueName: \"kubernetes.io/projected/9d4cf45f-202c-4246-845a-e67ddc65c337-kube-api-access-2tw9p\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\"" Apr 25 00:05:02.122264 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:05:02.122126 2569 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9d4cf45f-202c-4246-845a-e67ddc65c337-model-cache\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\"" Apr 25 00:05:02.122264 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:05:02.122140 2569 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9d4cf45f-202c-4246-845a-e67ddc65c337-tls-certs\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\"" Apr 25 00:05:02.122264 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:05:02.122153 2569 
reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9d4cf45f-202c-4246-845a-e67ddc65c337-tmp-dir\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\"" Apr 25 00:05:02.438684 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:05:02.438602 2569 generic.go:358] "Generic (PLEG): container finished" podID="9d4cf45f-202c-4246-845a-e67ddc65c337" containerID="1745260f2f81fa65346fcb5fe307cdcd43d7d60e3fef744dc93ba23698f79e16" exitCode=0 Apr 25 00:05:02.438684 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:05:02.438670 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-58fc84fd-f6zp4" event={"ID":"9d4cf45f-202c-4246-845a-e67ddc65c337","Type":"ContainerDied","Data":"1745260f2f81fa65346fcb5fe307cdcd43d7d60e3fef744dc93ba23698f79e16"} Apr 25 00:05:02.438876 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:05:02.438698 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-58fc84fd-f6zp4" event={"ID":"9d4cf45f-202c-4246-845a-e67ddc65c337","Type":"ContainerDied","Data":"a6ae2b311a4024e0b2d3e8b809453c999ff26c5ace6b2ef3379da94822fef8b1"} Apr 25 00:05:02.438876 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:05:02.438714 2569 scope.go:117] "RemoveContainer" containerID="1745260f2f81fa65346fcb5fe307cdcd43d7d60e3fef744dc93ba23698f79e16" Apr 25 00:05:02.438876 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:05:02.438676 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-58fc84fd-f6zp4" Apr 25 00:05:02.450463 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:05:02.450442 2569 scope.go:117] "RemoveContainer" containerID="41f5eb7252908a8ee84f7ba47c3268e4c40e4c2fd7c8fa34210cc76e428e7545" Apr 25 00:05:02.462918 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:05:02.462898 2569 scope.go:117] "RemoveContainer" containerID="1745260f2f81fa65346fcb5fe307cdcd43d7d60e3fef744dc93ba23698f79e16" Apr 25 00:05:02.463155 ip-10-0-133-214 kubenswrapper[2569]: E0425 00:05:02.463135 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1745260f2f81fa65346fcb5fe307cdcd43d7d60e3fef744dc93ba23698f79e16\": container with ID starting with 1745260f2f81fa65346fcb5fe307cdcd43d7d60e3fef744dc93ba23698f79e16 not found: ID does not exist" containerID="1745260f2f81fa65346fcb5fe307cdcd43d7d60e3fef744dc93ba23698f79e16" Apr 25 00:05:02.463237 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:05:02.463168 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1745260f2f81fa65346fcb5fe307cdcd43d7d60e3fef744dc93ba23698f79e16"} err="failed to get container status \"1745260f2f81fa65346fcb5fe307cdcd43d7d60e3fef744dc93ba23698f79e16\": rpc error: code = NotFound desc = could not find container \"1745260f2f81fa65346fcb5fe307cdcd43d7d60e3fef744dc93ba23698f79e16\": container with ID starting with 1745260f2f81fa65346fcb5fe307cdcd43d7d60e3fef744dc93ba23698f79e16 not found: ID does not exist" Apr 25 00:05:02.463237 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:05:02.463193 2569 scope.go:117] "RemoveContainer" containerID="41f5eb7252908a8ee84f7ba47c3268e4c40e4c2fd7c8fa34210cc76e428e7545" Apr 25 00:05:02.463487 ip-10-0-133-214 kubenswrapper[2569]: E0425 00:05:02.463441 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"41f5eb7252908a8ee84f7ba47c3268e4c40e4c2fd7c8fa34210cc76e428e7545\": container with ID starting with 41f5eb7252908a8ee84f7ba47c3268e4c40e4c2fd7c8fa34210cc76e428e7545 not found: ID does not exist" containerID="41f5eb7252908a8ee84f7ba47c3268e4c40e4c2fd7c8fa34210cc76e428e7545" Apr 25 00:05:02.463487 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:05:02.463465 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41f5eb7252908a8ee84f7ba47c3268e4c40e4c2fd7c8fa34210cc76e428e7545"} err="failed to get container status \"41f5eb7252908a8ee84f7ba47c3268e4c40e4c2fd7c8fa34210cc76e428e7545\": rpc error: code = NotFound desc = could not find container \"41f5eb7252908a8ee84f7ba47c3268e4c40e4c2fd7c8fa34210cc76e428e7545\": container with ID starting with 41f5eb7252908a8ee84f7ba47c3268e4c40e4c2fd7c8fa34210cc76e428e7545 not found: ID does not exist" Apr 25 00:05:02.468311 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:05:02.468287 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-58fc84fd-f6zp4"] Apr 25 00:05:02.470382 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:05:02.470363 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-58fc84fd-f6zp4"] Apr 25 00:05:03.976775 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:05:03.976733 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4cf45f-202c-4246-845a-e67ddc65c337" path="/var/lib/kubelet/pods/9d4cf45f-202c-4246-845a-e67ddc65c337/volumes" Apr 25 00:05:13.658546 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:05:13.658511 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-64f57874fd-xm6kr"] Apr 25 00:05:13.658968 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:05:13.658950 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="9d4cf45f-202c-4246-845a-e67ddc65c337" containerName="storage-initializer" Apr 25 00:05:13.659012 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:05:13.658972 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d4cf45f-202c-4246-845a-e67ddc65c337" containerName="storage-initializer" Apr 25 00:05:13.659012 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:05:13.659003 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9d4cf45f-202c-4246-845a-e67ddc65c337" containerName="main" Apr 25 00:05:13.659101 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:05:13.659011 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d4cf45f-202c-4246-845a-e67ddc65c337" containerName="main" Apr 25 00:05:13.659138 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:05:13.659104 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="9d4cf45f-202c-4246-845a-e67ddc65c337" containerName="main" Apr 25 00:05:13.662542 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:05:13.662521 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-64f57874fd-xm6kr" Apr 25 00:05:13.665889 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:05:13.665706 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-configmap-ref-test-kserve-self-signed-certs\"" Apr 25 00:05:13.665889 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:05:13.665770 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-4dwzn\"" Apr 25 00:05:13.665889 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:05:13.665769 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 25 00:05:13.665889 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:05:13.665708 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 25 00:05:13.668866 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:05:13.668844 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-64f57874fd-xm6kr"] Apr 25 00:05:13.717575 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:05:13.717545 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrgvg\" (UniqueName: \"kubernetes.io/projected/114b7738-d29f-4c32-a20c-ff170f51a30f-kube-api-access-qrgvg\") pod \"scheduler-configmap-ref-test-kserve-64f57874fd-xm6kr\" (UID: \"114b7738-d29f-4c32-a20c-ff170f51a30f\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-64f57874fd-xm6kr" Apr 25 00:05:13.717710 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:05:13.717605 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/114b7738-d29f-4c32-a20c-ff170f51a30f-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-64f57874fd-xm6kr\" (UID: \"114b7738-d29f-4c32-a20c-ff170f51a30f\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-64f57874fd-xm6kr" Apr 25 00:05:13.717710 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:05:13.717642 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/114b7738-d29f-4c32-a20c-ff170f51a30f-dshm\") pod \"scheduler-configmap-ref-test-kserve-64f57874fd-xm6kr\" (UID: \"114b7738-d29f-4c32-a20c-ff170f51a30f\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-64f57874fd-xm6kr" Apr 25 00:05:13.717710 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:05:13.717658 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/114b7738-d29f-4c32-a20c-ff170f51a30f-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-64f57874fd-xm6kr\" (UID: \"114b7738-d29f-4c32-a20c-ff170f51a30f\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-64f57874fd-xm6kr" Apr 25 00:05:13.717710 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:05:13.717684 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/114b7738-d29f-4c32-a20c-ff170f51a30f-home\") pod \"scheduler-configmap-ref-test-kserve-64f57874fd-xm6kr\" (UID: \"114b7738-d29f-4c32-a20c-ff170f51a30f\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-64f57874fd-xm6kr" Apr 25 00:05:13.717710 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:05:13.717699 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/114b7738-d29f-4c32-a20c-ff170f51a30f-model-cache\") pod 
\"scheduler-configmap-ref-test-kserve-64f57874fd-xm6kr\" (UID: \"114b7738-d29f-4c32-a20c-ff170f51a30f\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-64f57874fd-xm6kr" Apr 25 00:05:13.717890 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:05:13.717749 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/114b7738-d29f-4c32-a20c-ff170f51a30f-tmp-dir\") pod \"scheduler-configmap-ref-test-kserve-64f57874fd-xm6kr\" (UID: \"114b7738-d29f-4c32-a20c-ff170f51a30f\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-64f57874fd-xm6kr" Apr 25 00:05:13.818162 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:05:13.818130 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qrgvg\" (UniqueName: \"kubernetes.io/projected/114b7738-d29f-4c32-a20c-ff170f51a30f-kube-api-access-qrgvg\") pod \"scheduler-configmap-ref-test-kserve-64f57874fd-xm6kr\" (UID: \"114b7738-d29f-4c32-a20c-ff170f51a30f\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-64f57874fd-xm6kr" Apr 25 00:05:13.818315 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:05:13.818188 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/114b7738-d29f-4c32-a20c-ff170f51a30f-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-64f57874fd-xm6kr\" (UID: \"114b7738-d29f-4c32-a20c-ff170f51a30f\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-64f57874fd-xm6kr" Apr 25 00:05:13.818315 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:05:13.818221 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/114b7738-d29f-4c32-a20c-ff170f51a30f-dshm\") pod \"scheduler-configmap-ref-test-kserve-64f57874fd-xm6kr\" (UID: \"114b7738-d29f-4c32-a20c-ff170f51a30f\") " 
pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-64f57874fd-xm6kr" Apr 25 00:05:13.818315 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:05:13.818244 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/114b7738-d29f-4c32-a20c-ff170f51a30f-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-64f57874fd-xm6kr\" (UID: \"114b7738-d29f-4c32-a20c-ff170f51a30f\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-64f57874fd-xm6kr" Apr 25 00:05:13.818475 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:05:13.818374 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/114b7738-d29f-4c32-a20c-ff170f51a30f-home\") pod \"scheduler-configmap-ref-test-kserve-64f57874fd-xm6kr\" (UID: \"114b7738-d29f-4c32-a20c-ff170f51a30f\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-64f57874fd-xm6kr" Apr 25 00:05:13.818475 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:05:13.818432 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/114b7738-d29f-4c32-a20c-ff170f51a30f-model-cache\") pod \"scheduler-configmap-ref-test-kserve-64f57874fd-xm6kr\" (UID: \"114b7738-d29f-4c32-a20c-ff170f51a30f\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-64f57874fd-xm6kr" Apr 25 00:05:13.818475 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:05:13.818470 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/114b7738-d29f-4c32-a20c-ff170f51a30f-tmp-dir\") pod \"scheduler-configmap-ref-test-kserve-64f57874fd-xm6kr\" (UID: \"114b7738-d29f-4c32-a20c-ff170f51a30f\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-64f57874fd-xm6kr" Apr 25 00:05:13.818708 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:05:13.818678 2569 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/114b7738-d29f-4c32-a20c-ff170f51a30f-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-64f57874fd-xm6kr\" (UID: \"114b7738-d29f-4c32-a20c-ff170f51a30f\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-64f57874fd-xm6kr" Apr 25 00:05:13.818791 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:05:13.818728 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/114b7738-d29f-4c32-a20c-ff170f51a30f-home\") pod \"scheduler-configmap-ref-test-kserve-64f57874fd-xm6kr\" (UID: \"114b7738-d29f-4c32-a20c-ff170f51a30f\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-64f57874fd-xm6kr" Apr 25 00:05:13.818839 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:05:13.818819 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/114b7738-d29f-4c32-a20c-ff170f51a30f-tmp-dir\") pod \"scheduler-configmap-ref-test-kserve-64f57874fd-xm6kr\" (UID: \"114b7738-d29f-4c32-a20c-ff170f51a30f\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-64f57874fd-xm6kr" Apr 25 00:05:13.818839 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:05:13.818827 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/114b7738-d29f-4c32-a20c-ff170f51a30f-model-cache\") pod \"scheduler-configmap-ref-test-kserve-64f57874fd-xm6kr\" (UID: \"114b7738-d29f-4c32-a20c-ff170f51a30f\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-64f57874fd-xm6kr" Apr 25 00:05:13.820554 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:05:13.820536 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/114b7738-d29f-4c32-a20c-ff170f51a30f-dshm\") pod 
\"scheduler-configmap-ref-test-kserve-64f57874fd-xm6kr\" (UID: \"114b7738-d29f-4c32-a20c-ff170f51a30f\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-64f57874fd-xm6kr" Apr 25 00:05:13.820963 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:05:13.820941 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/114b7738-d29f-4c32-a20c-ff170f51a30f-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-64f57874fd-xm6kr\" (UID: \"114b7738-d29f-4c32-a20c-ff170f51a30f\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-64f57874fd-xm6kr" Apr 25 00:05:13.825513 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:05:13.825495 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrgvg\" (UniqueName: \"kubernetes.io/projected/114b7738-d29f-4c32-a20c-ff170f51a30f-kube-api-access-qrgvg\") pod \"scheduler-configmap-ref-test-kserve-64f57874fd-xm6kr\" (UID: \"114b7738-d29f-4c32-a20c-ff170f51a30f\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-64f57874fd-xm6kr" Apr 25 00:05:13.975367 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:05:13.975284 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-64f57874fd-xm6kr" Apr 25 00:05:14.102591 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:05:14.102565 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-64f57874fd-xm6kr"] Apr 25 00:05:14.104055 ip-10-0-133-214 kubenswrapper[2569]: W0425 00:05:14.104027 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod114b7738_d29f_4c32_a20c_ff170f51a30f.slice/crio-7d3937c1eaaadb71d9618bba12b1ec93ef16035fce611f8b239707a52cce2e24 WatchSource:0}: Error finding container 7d3937c1eaaadb71d9618bba12b1ec93ef16035fce611f8b239707a52cce2e24: Status 404 returned error can't find the container with id 7d3937c1eaaadb71d9618bba12b1ec93ef16035fce611f8b239707a52cce2e24 Apr 25 00:05:14.483809 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:05:14.483774 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-64f57874fd-xm6kr" event={"ID":"114b7738-d29f-4c32-a20c-ff170f51a30f","Type":"ContainerStarted","Data":"d1fc2857682db28251468cc3c9845c1c575f58801099c66a53d3bebdffda417b"} Apr 25 00:05:14.483809 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:05:14.483814 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-64f57874fd-xm6kr" event={"ID":"114b7738-d29f-4c32-a20c-ff170f51a30f","Type":"ContainerStarted","Data":"7d3937c1eaaadb71d9618bba12b1ec93ef16035fce611f8b239707a52cce2e24"} Apr 25 00:05:18.500553 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:05:18.500522 2569 generic.go:358] "Generic (PLEG): container finished" podID="114b7738-d29f-4c32-a20c-ff170f51a30f" containerID="d1fc2857682db28251468cc3c9845c1c575f58801099c66a53d3bebdffda417b" exitCode=0 Apr 25 00:05:18.501029 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:05:18.500554 2569 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-64f57874fd-xm6kr" event={"ID":"114b7738-d29f-4c32-a20c-ff170f51a30f","Type":"ContainerDied","Data":"d1fc2857682db28251468cc3c9845c1c575f58801099c66a53d3bebdffda417b"} Apr 25 00:05:19.505962 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:05:19.505927 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-64f57874fd-xm6kr" event={"ID":"114b7738-d29f-4c32-a20c-ff170f51a30f","Type":"ContainerStarted","Data":"2839f4280917b8a19b1a2323aafe627ef571111884e0e3553b0b6170a4ae1c88"} Apr 25 00:05:19.524947 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:05:19.524900 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-64f57874fd-xm6kr" podStartSLOduration=6.524885348 podStartE2EDuration="6.524885348s" podCreationTimestamp="2026-04-25 00:05:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-25 00:05:19.524489773 +0000 UTC m=+698.129216370" watchObservedRunningTime="2026-04-25 00:05:19.524885348 +0000 UTC m=+698.129611946" Apr 25 00:05:23.977037 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:05:23.977001 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-64f57874fd-xm6kr" Apr 25 00:05:23.977037 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:05:23.977042 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-64f57874fd-xm6kr" Apr 25 00:05:23.988276 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:05:23.988249 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-64f57874fd-xm6kr" Apr 25 00:05:24.535006 ip-10-0-133-214 
kubenswrapper[2569]: I0425 00:05:24.534974 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-64f57874fd-xm6kr" Apr 25 00:05:46.369269 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:05:46.369233 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-64f57874fd-xm6kr"] Apr 25 00:05:46.369719 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:05:46.369543 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-64f57874fd-xm6kr" podUID="114b7738-d29f-4c32-a20c-ff170f51a30f" containerName="main" containerID="cri-o://2839f4280917b8a19b1a2323aafe627ef571111884e0e3553b0b6170a4ae1c88" gracePeriod=30 Apr 25 00:05:46.606362 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:05:46.606330 2569 generic.go:358] "Generic (PLEG): container finished" podID="114b7738-d29f-4c32-a20c-ff170f51a30f" containerID="2839f4280917b8a19b1a2323aafe627ef571111884e0e3553b0b6170a4ae1c88" exitCode=0 Apr 25 00:05:46.606525 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:05:46.606436 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-64f57874fd-xm6kr" event={"ID":"114b7738-d29f-4c32-a20c-ff170f51a30f","Type":"ContainerDied","Data":"2839f4280917b8a19b1a2323aafe627ef571111884e0e3553b0b6170a4ae1c88"} Apr 25 00:05:46.606525 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:05:46.606472 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-64f57874fd-xm6kr" event={"ID":"114b7738-d29f-4c32-a20c-ff170f51a30f","Type":"ContainerDied","Data":"7d3937c1eaaadb71d9618bba12b1ec93ef16035fce611f8b239707a52cce2e24"} Apr 25 00:05:46.606525 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:05:46.606483 2569 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="7d3937c1eaaadb71d9618bba12b1ec93ef16035fce611f8b239707a52cce2e24" Apr 25 00:05:46.611147 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:05:46.611125 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-64f57874fd-xm6kr" Apr 25 00:05:46.801769 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:05:46.801736 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/114b7738-d29f-4c32-a20c-ff170f51a30f-kserve-provision-location\") pod \"114b7738-d29f-4c32-a20c-ff170f51a30f\" (UID: \"114b7738-d29f-4c32-a20c-ff170f51a30f\") " Apr 25 00:05:46.801960 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:05:46.801782 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/114b7738-d29f-4c32-a20c-ff170f51a30f-dshm\") pod \"114b7738-d29f-4c32-a20c-ff170f51a30f\" (UID: \"114b7738-d29f-4c32-a20c-ff170f51a30f\") " Apr 25 00:05:46.801960 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:05:46.801801 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/114b7738-d29f-4c32-a20c-ff170f51a30f-tmp-dir\") pod \"114b7738-d29f-4c32-a20c-ff170f51a30f\" (UID: \"114b7738-d29f-4c32-a20c-ff170f51a30f\") " Apr 25 00:05:46.801960 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:05:46.801825 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qrgvg\" (UniqueName: \"kubernetes.io/projected/114b7738-d29f-4c32-a20c-ff170f51a30f-kube-api-access-qrgvg\") pod \"114b7738-d29f-4c32-a20c-ff170f51a30f\" (UID: \"114b7738-d29f-4c32-a20c-ff170f51a30f\") " Apr 25 00:05:46.801960 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:05:46.801850 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" 
(UniqueName: \"kubernetes.io/empty-dir/114b7738-d29f-4c32-a20c-ff170f51a30f-home\") pod \"114b7738-d29f-4c32-a20c-ff170f51a30f\" (UID: \"114b7738-d29f-4c32-a20c-ff170f51a30f\") " Apr 25 00:05:46.801960 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:05:46.801893 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/114b7738-d29f-4c32-a20c-ff170f51a30f-model-cache\") pod \"114b7738-d29f-4c32-a20c-ff170f51a30f\" (UID: \"114b7738-d29f-4c32-a20c-ff170f51a30f\") " Apr 25 00:05:46.802224 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:05:46.802075 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/114b7738-d29f-4c32-a20c-ff170f51a30f-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "114b7738-d29f-4c32-a20c-ff170f51a30f" (UID: "114b7738-d29f-4c32-a20c-ff170f51a30f"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 25 00:05:46.802224 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:05:46.802086 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/114b7738-d29f-4c32-a20c-ff170f51a30f-tls-certs\") pod \"114b7738-d29f-4c32-a20c-ff170f51a30f\" (UID: \"114b7738-d29f-4c32-a20c-ff170f51a30f\") " Apr 25 00:05:46.802422 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:05:46.802376 2569 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/114b7738-d29f-4c32-a20c-ff170f51a30f-tmp-dir\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\"" Apr 25 00:05:46.802613 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:05:46.802585 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/114b7738-d29f-4c32-a20c-ff170f51a30f-model-cache" (OuterVolumeSpecName: "model-cache") pod "114b7738-d29f-4c32-a20c-ff170f51a30f" (UID: 
"114b7738-d29f-4c32-a20c-ff170f51a30f"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 25 00:05:46.802672 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:05:46.802607 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/114b7738-d29f-4c32-a20c-ff170f51a30f-home" (OuterVolumeSpecName: "home") pod "114b7738-d29f-4c32-a20c-ff170f51a30f" (UID: "114b7738-d29f-4c32-a20c-ff170f51a30f"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 25 00:05:46.804364 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:05:46.804340 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/114b7738-d29f-4c32-a20c-ff170f51a30f-dshm" (OuterVolumeSpecName: "dshm") pod "114b7738-d29f-4c32-a20c-ff170f51a30f" (UID: "114b7738-d29f-4c32-a20c-ff170f51a30f"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 25 00:05:46.804546 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:05:46.804514 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/114b7738-d29f-4c32-a20c-ff170f51a30f-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "114b7738-d29f-4c32-a20c-ff170f51a30f" (UID: "114b7738-d29f-4c32-a20c-ff170f51a30f"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 25 00:05:46.804649 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:05:46.804608 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/114b7738-d29f-4c32-a20c-ff170f51a30f-kube-api-access-qrgvg" (OuterVolumeSpecName: "kube-api-access-qrgvg") pod "114b7738-d29f-4c32-a20c-ff170f51a30f" (UID: "114b7738-d29f-4c32-a20c-ff170f51a30f"). InnerVolumeSpecName "kube-api-access-qrgvg". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 25 00:05:46.855735 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:05:46.855702 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/114b7738-d29f-4c32-a20c-ff170f51a30f-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "114b7738-d29f-4c32-a20c-ff170f51a30f" (UID: "114b7738-d29f-4c32-a20c-ff170f51a30f"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 25 00:05:46.903459 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:05:46.903434 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qrgvg\" (UniqueName: \"kubernetes.io/projected/114b7738-d29f-4c32-a20c-ff170f51a30f-kube-api-access-qrgvg\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\"" Apr 25 00:05:46.903459 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:05:46.903458 2569 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/114b7738-d29f-4c32-a20c-ff170f51a30f-home\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\"" Apr 25 00:05:46.903612 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:05:46.903468 2569 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/114b7738-d29f-4c32-a20c-ff170f51a30f-model-cache\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\"" Apr 25 00:05:46.903612 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:05:46.903479 2569 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/114b7738-d29f-4c32-a20c-ff170f51a30f-tls-certs\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\"" Apr 25 00:05:46.903612 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:05:46.903488 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/114b7738-d29f-4c32-a20c-ff170f51a30f-kserve-provision-location\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\"" Apr 25 00:05:46.903612 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:05:46.903495 2569 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/114b7738-d29f-4c32-a20c-ff170f51a30f-dshm\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\"" Apr 25 00:05:47.609671 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:05:47.609645 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-64f57874fd-xm6kr" Apr 25 00:05:47.631270 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:05:47.631239 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-64f57874fd-xm6kr"] Apr 25 00:05:47.635013 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:05:47.634987 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-64f57874fd-xm6kr"] Apr 25 00:05:47.977504 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:05:47.977398 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="114b7738-d29f-4c32-a20c-ff170f51a30f" path="/var/lib/kubelet/pods/114b7738-d29f-4c32-a20c-ff170f51a30f/volumes" Apr 25 00:06:01.364914 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:06:01.364884 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-84975459c8-whc47"] Apr 25 00:06:01.365261 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:06:01.365224 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="114b7738-d29f-4c32-a20c-ff170f51a30f" containerName="storage-initializer" Apr 25 00:06:01.365261 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:06:01.365235 2569 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="114b7738-d29f-4c32-a20c-ff170f51a30f" containerName="storage-initializer" Apr 25 00:06:01.365261 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:06:01.365253 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="114b7738-d29f-4c32-a20c-ff170f51a30f" containerName="main" Apr 25 00:06:01.365261 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:06:01.365258 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="114b7738-d29f-4c32-a20c-ff170f51a30f" containerName="main" Apr 25 00:06:01.365463 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:06:01.365312 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="114b7738-d29f-4c32-a20c-ff170f51a30f" containerName="main" Apr 25 00:06:01.369782 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:06:01.369763 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-84975459c8-whc47" Apr 25 00:06:01.372080 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:06:01.372057 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 25 00:06:01.373113 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:06:01.373096 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 25 00:06:01.373178 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:06:01.373129 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-4dwzn\"" Apr 25 00:06:01.373226 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:06:01.373129 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-kserve-self-signed-certs\"" Apr 25 00:06:01.377532 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:06:01.377513 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-84975459c8-whc47"] Apr 25 00:06:01.404919 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:06:01.404891 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/18ade40a-660f-4590-a2b0-dc23c09e5fa2-dshm\") pod \"scheduler-ha-replicas-test-kserve-84975459c8-whc47\" (UID: \"18ade40a-660f-4590-a2b0-dc23c09e5fa2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-84975459c8-whc47" Apr 25 00:06:01.405038 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:06:01.404928 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/18ade40a-660f-4590-a2b0-dc23c09e5fa2-model-cache\") pod \"scheduler-ha-replicas-test-kserve-84975459c8-whc47\" (UID: \"18ade40a-660f-4590-a2b0-dc23c09e5fa2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-84975459c8-whc47" Apr 25 00:06:01.405038 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:06:01.404962 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/18ade40a-660f-4590-a2b0-dc23c09e5fa2-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-84975459c8-whc47\" (UID: \"18ade40a-660f-4590-a2b0-dc23c09e5fa2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-84975459c8-whc47" Apr 25 00:06:01.405038 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:06:01.404978 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xr6wr\" (UniqueName: \"kubernetes.io/projected/18ade40a-660f-4590-a2b0-dc23c09e5fa2-kube-api-access-xr6wr\") pod \"scheduler-ha-replicas-test-kserve-84975459c8-whc47\" (UID: \"18ade40a-660f-4590-a2b0-dc23c09e5fa2\") " 
pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-84975459c8-whc47" Apr 25 00:06:01.405038 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:06:01.404995 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/18ade40a-660f-4590-a2b0-dc23c09e5fa2-tmp-dir\") pod \"scheduler-ha-replicas-test-kserve-84975459c8-whc47\" (UID: \"18ade40a-660f-4590-a2b0-dc23c09e5fa2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-84975459c8-whc47" Apr 25 00:06:01.405038 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:06:01.405014 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/18ade40a-660f-4590-a2b0-dc23c09e5fa2-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-84975459c8-whc47\" (UID: \"18ade40a-660f-4590-a2b0-dc23c09e5fa2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-84975459c8-whc47" Apr 25 00:06:01.405206 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:06:01.405082 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/18ade40a-660f-4590-a2b0-dc23c09e5fa2-home\") pod \"scheduler-ha-replicas-test-kserve-84975459c8-whc47\" (UID: \"18ade40a-660f-4590-a2b0-dc23c09e5fa2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-84975459c8-whc47" Apr 25 00:06:01.505494 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:06:01.505457 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/18ade40a-660f-4590-a2b0-dc23c09e5fa2-dshm\") pod \"scheduler-ha-replicas-test-kserve-84975459c8-whc47\" (UID: \"18ade40a-660f-4590-a2b0-dc23c09e5fa2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-84975459c8-whc47" Apr 25 00:06:01.505674 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:06:01.505504 2569 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/18ade40a-660f-4590-a2b0-dc23c09e5fa2-model-cache\") pod \"scheduler-ha-replicas-test-kserve-84975459c8-whc47\" (UID: \"18ade40a-660f-4590-a2b0-dc23c09e5fa2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-84975459c8-whc47" Apr 25 00:06:01.505674 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:06:01.505564 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/18ade40a-660f-4590-a2b0-dc23c09e5fa2-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-84975459c8-whc47\" (UID: \"18ade40a-660f-4590-a2b0-dc23c09e5fa2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-84975459c8-whc47" Apr 25 00:06:01.505674 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:06:01.505592 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xr6wr\" (UniqueName: \"kubernetes.io/projected/18ade40a-660f-4590-a2b0-dc23c09e5fa2-kube-api-access-xr6wr\") pod \"scheduler-ha-replicas-test-kserve-84975459c8-whc47\" (UID: \"18ade40a-660f-4590-a2b0-dc23c09e5fa2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-84975459c8-whc47" Apr 25 00:06:01.505842 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:06:01.505780 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/18ade40a-660f-4590-a2b0-dc23c09e5fa2-tmp-dir\") pod \"scheduler-ha-replicas-test-kserve-84975459c8-whc47\" (UID: \"18ade40a-660f-4590-a2b0-dc23c09e5fa2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-84975459c8-whc47" Apr 25 00:06:01.505842 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:06:01.505818 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/18ade40a-660f-4590-a2b0-dc23c09e5fa2-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-84975459c8-whc47\" (UID: \"18ade40a-660f-4590-a2b0-dc23c09e5fa2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-84975459c8-whc47" Apr 25 00:06:01.505943 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:06:01.505861 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/18ade40a-660f-4590-a2b0-dc23c09e5fa2-home\") pod \"scheduler-ha-replicas-test-kserve-84975459c8-whc47\" (UID: \"18ade40a-660f-4590-a2b0-dc23c09e5fa2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-84975459c8-whc47" Apr 25 00:06:01.506020 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:06:01.505994 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/18ade40a-660f-4590-a2b0-dc23c09e5fa2-model-cache\") pod \"scheduler-ha-replicas-test-kserve-84975459c8-whc47\" (UID: \"18ade40a-660f-4590-a2b0-dc23c09e5fa2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-84975459c8-whc47" Apr 25 00:06:01.506073 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:06:01.506020 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/18ade40a-660f-4590-a2b0-dc23c09e5fa2-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-84975459c8-whc47\" (UID: \"18ade40a-660f-4590-a2b0-dc23c09e5fa2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-84975459c8-whc47" Apr 25 00:06:01.506238 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:06:01.506218 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/18ade40a-660f-4590-a2b0-dc23c09e5fa2-tmp-dir\") pod \"scheduler-ha-replicas-test-kserve-84975459c8-whc47\" (UID: \"18ade40a-660f-4590-a2b0-dc23c09e5fa2\") " 
pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-84975459c8-whc47" Apr 25 00:06:01.506275 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:06:01.506236 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/18ade40a-660f-4590-a2b0-dc23c09e5fa2-home\") pod \"scheduler-ha-replicas-test-kserve-84975459c8-whc47\" (UID: \"18ade40a-660f-4590-a2b0-dc23c09e5fa2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-84975459c8-whc47" Apr 25 00:06:01.507810 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:06:01.507783 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/18ade40a-660f-4590-a2b0-dc23c09e5fa2-dshm\") pod \"scheduler-ha-replicas-test-kserve-84975459c8-whc47\" (UID: \"18ade40a-660f-4590-a2b0-dc23c09e5fa2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-84975459c8-whc47" Apr 25 00:06:01.508333 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:06:01.508311 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/18ade40a-660f-4590-a2b0-dc23c09e5fa2-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-84975459c8-whc47\" (UID: \"18ade40a-660f-4590-a2b0-dc23c09e5fa2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-84975459c8-whc47" Apr 25 00:06:01.512997 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:06:01.512976 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xr6wr\" (UniqueName: \"kubernetes.io/projected/18ade40a-660f-4590-a2b0-dc23c09e5fa2-kube-api-access-xr6wr\") pod \"scheduler-ha-replicas-test-kserve-84975459c8-whc47\" (UID: \"18ade40a-660f-4590-a2b0-dc23c09e5fa2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-84975459c8-whc47" Apr 25 00:06:01.639588 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:06:01.639492 2569 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-645c8c64c48q"] Apr 25 00:06:01.643830 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:06:01.643809 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-645c8c64c48q" Apr 25 00:06:01.646090 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:06:01.646071 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-epp-sa-dockercfg-db5td\"" Apr 25 00:06:01.655055 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:06:01.655030 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-645c8c64c48q"] Apr 25 00:06:01.681038 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:06:01.681009 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-84975459c8-whc47" Apr 25 00:06:01.707251 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:06:01.707220 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/36b7f2db-6bc4-4972-a574-ceba0e7d310a-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-645c8c64c48q\" (UID: \"36b7f2db-6bc4-4972-a574-ceba0e7d310a\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-645c8c64c48q" Apr 25 00:06:01.707433 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:06:01.707255 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/36b7f2db-6bc4-4972-a574-ceba0e7d310a-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-645c8c64c48q\" (UID: \"36b7f2db-6bc4-4972-a574-ceba0e7d310a\") " 
pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-645c8c64c48q" Apr 25 00:06:01.707433 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:06:01.707273 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/36b7f2db-6bc4-4972-a574-ceba0e7d310a-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-645c8c64c48q\" (UID: \"36b7f2db-6bc4-4972-a574-ceba0e7d310a\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-645c8c64c48q" Apr 25 00:06:01.707433 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:06:01.707298 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/36b7f2db-6bc4-4972-a574-ceba0e7d310a-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-645c8c64c48q\" (UID: \"36b7f2db-6bc4-4972-a574-ceba0e7d310a\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-645c8c64c48q" Apr 25 00:06:01.707433 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:06:01.707417 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/36b7f2db-6bc4-4972-a574-ceba0e7d310a-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-645c8c64c48q\" (UID: \"36b7f2db-6bc4-4972-a574-ceba0e7d310a\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-645c8c64c48q" Apr 25 00:06:01.707618 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:06:01.707475 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpbdk\" (UniqueName: \"kubernetes.io/projected/36b7f2db-6bc4-4972-a574-ceba0e7d310a-kube-api-access-mpbdk\") pod 
\"scheduler-ha-replicas-test-kserve-router-scheduler-645c8c64c48q\" (UID: \"36b7f2db-6bc4-4972-a574-ceba0e7d310a\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-645c8c64c48q" Apr 25 00:06:01.803820 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:06:01.803780 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-84975459c8-whc47"] Apr 25 00:06:01.807358 ip-10-0-133-214 kubenswrapper[2569]: W0425 00:06:01.807321 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18ade40a_660f_4590_a2b0_dc23c09e5fa2.slice/crio-3fe4de17daa0c23067d488eb00816d635a59f655f5ba0b2898312b2529b77319 WatchSource:0}: Error finding container 3fe4de17daa0c23067d488eb00816d635a59f655f5ba0b2898312b2529b77319: Status 404 returned error can't find the container with id 3fe4de17daa0c23067d488eb00816d635a59f655f5ba0b2898312b2529b77319 Apr 25 00:06:01.808506 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:06:01.808473 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/36b7f2db-6bc4-4972-a574-ceba0e7d310a-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-645c8c64c48q\" (UID: \"36b7f2db-6bc4-4972-a574-ceba0e7d310a\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-645c8c64c48q" Apr 25 00:06:01.808602 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:06:01.808524 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mpbdk\" (UniqueName: \"kubernetes.io/projected/36b7f2db-6bc4-4972-a574-ceba0e7d310a-kube-api-access-mpbdk\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-645c8c64c48q\" (UID: \"36b7f2db-6bc4-4972-a574-ceba0e7d310a\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-645c8c64c48q" Apr 25 
00:06:01.808649 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:06:01.808607 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/36b7f2db-6bc4-4972-a574-ceba0e7d310a-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-645c8c64c48q\" (UID: \"36b7f2db-6bc4-4972-a574-ceba0e7d310a\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-645c8c64c48q" Apr 25 00:06:01.808649 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:06:01.808634 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/36b7f2db-6bc4-4972-a574-ceba0e7d310a-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-645c8c64c48q\" (UID: \"36b7f2db-6bc4-4972-a574-ceba0e7d310a\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-645c8c64c48q" Apr 25 00:06:01.808756 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:06:01.808664 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/36b7f2db-6bc4-4972-a574-ceba0e7d310a-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-645c8c64c48q\" (UID: \"36b7f2db-6bc4-4972-a574-ceba0e7d310a\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-645c8c64c48q" Apr 25 00:06:01.808808 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:06:01.808784 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/36b7f2db-6bc4-4972-a574-ceba0e7d310a-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-645c8c64c48q\" (UID: \"36b7f2db-6bc4-4972-a574-ceba0e7d310a\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-645c8c64c48q" Apr 25 00:06:01.808923 ip-10-0-133-214 kubenswrapper[2569]: I0425 
00:06:01.808867 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/36b7f2db-6bc4-4972-a574-ceba0e7d310a-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-645c8c64c48q\" (UID: \"36b7f2db-6bc4-4972-a574-ceba0e7d310a\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-645c8c64c48q" Apr 25 00:06:01.808923 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:06:01.808904 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/36b7f2db-6bc4-4972-a574-ceba0e7d310a-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-645c8c64c48q\" (UID: \"36b7f2db-6bc4-4972-a574-ceba0e7d310a\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-645c8c64c48q" Apr 25 00:06:01.809035 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:06:01.808993 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/36b7f2db-6bc4-4972-a574-ceba0e7d310a-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-645c8c64c48q\" (UID: \"36b7f2db-6bc4-4972-a574-ceba0e7d310a\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-645c8c64c48q" Apr 25 00:06:01.809207 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:06:01.809188 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/36b7f2db-6bc4-4972-a574-ceba0e7d310a-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-645c8c64c48q\" (UID: \"36b7f2db-6bc4-4972-a574-ceba0e7d310a\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-645c8c64c48q" Apr 25 00:06:01.811195 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:06:01.811171 2569 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/36b7f2db-6bc4-4972-a574-ceba0e7d310a-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-645c8c64c48q\" (UID: \"36b7f2db-6bc4-4972-a574-ceba0e7d310a\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-645c8c64c48q" Apr 25 00:06:01.815971 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:06:01.815951 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpbdk\" (UniqueName: \"kubernetes.io/projected/36b7f2db-6bc4-4972-a574-ceba0e7d310a-kube-api-access-mpbdk\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-645c8c64c48q\" (UID: \"36b7f2db-6bc4-4972-a574-ceba0e7d310a\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-645c8c64c48q" Apr 25 00:06:01.955241 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:06:01.955158 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-645c8c64c48q" Apr 25 00:06:02.090778 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:06:02.090749 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-645c8c64c48q"] Apr 25 00:06:02.091649 ip-10-0-133-214 kubenswrapper[2569]: W0425 00:06:02.091617 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36b7f2db_6bc4_4972_a574_ceba0e7d310a.slice/crio-f248d45a1e6b843d7c868bb1295127e53e7175d88fc26464cb92f11a3cc755a8 WatchSource:0}: Error finding container f248d45a1e6b843d7c868bb1295127e53e7175d88fc26464cb92f11a3cc755a8: Status 404 returned error can't find the container with id f248d45a1e6b843d7c868bb1295127e53e7175d88fc26464cb92f11a3cc755a8 Apr 25 00:06:02.671832 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:06:02.671798 2569 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-84975459c8-whc47" event={"ID":"18ade40a-660f-4590-a2b0-dc23c09e5fa2","Type":"ContainerStarted","Data":"47320475a2b7b9fc7fc0f8ff7156b8428fd435ec590256f6f93f2ac911430b76"} Apr 25 00:06:02.671832 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:06:02.671835 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-84975459c8-whc47" event={"ID":"18ade40a-660f-4590-a2b0-dc23c09e5fa2","Type":"ContainerStarted","Data":"3fe4de17daa0c23067d488eb00816d635a59f655f5ba0b2898312b2529b77319"} Apr 25 00:06:02.673140 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:06:02.673121 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-645c8c64c48q" event={"ID":"36b7f2db-6bc4-4972-a574-ceba0e7d310a","Type":"ContainerStarted","Data":"211d26694bc7405408ab1ed1f4f1e6f9a2b2ecf5ae7942314eed0709b54e7059"} Apr 25 00:06:02.673210 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:06:02.673145 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-645c8c64c48q" event={"ID":"36b7f2db-6bc4-4972-a574-ceba0e7d310a","Type":"ContainerStarted","Data":"f248d45a1e6b843d7c868bb1295127e53e7175d88fc26464cb92f11a3cc755a8"} Apr 25 00:06:03.678521 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:06:03.678481 2569 generic.go:358] "Generic (PLEG): container finished" podID="36b7f2db-6bc4-4972-a574-ceba0e7d310a" containerID="211d26694bc7405408ab1ed1f4f1e6f9a2b2ecf5ae7942314eed0709b54e7059" exitCode=0 Apr 25 00:06:03.678920 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:06:03.678561 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-645c8c64c48q" 
event={"ID":"36b7f2db-6bc4-4972-a574-ceba0e7d310a","Type":"ContainerDied","Data":"211d26694bc7405408ab1ed1f4f1e6f9a2b2ecf5ae7942314eed0709b54e7059"} Apr 25 00:06:05.690804 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:06:05.690753 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-645c8c64c48q" event={"ID":"36b7f2db-6bc4-4972-a574-ceba0e7d310a","Type":"ContainerStarted","Data":"ed9e2c01acebfd3490bd3def23a3a24724485513fbcf7adc45d473500f381001"} Apr 25 00:06:35.827331 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:06:35.827294 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-645c8c64c48q" event={"ID":"36b7f2db-6bc4-4972-a574-ceba0e7d310a","Type":"ContainerStarted","Data":"00434ecc0394c591959ac2b82fb2b2eaf40dc55ecc9eb1ff36f7c79498fbfbb0"} Apr 25 00:06:35.827808 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:06:35.827639 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-645c8c64c48q" Apr 25 00:06:35.830359 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:06:35.830332 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-645c8c64c48q" podUID="36b7f2db-6bc4-4972-a574-ceba0e7d310a" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 25 00:06:35.848198 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:06:35.848139 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-645c8c64c48q" podStartSLOduration=3.600891372 podStartE2EDuration="34.848121661s" podCreationTimestamp="2026-04-25 00:06:01 +0000 UTC" firstStartedPulling="2026-04-25 00:06:03.679709692 +0000 UTC m=+742.284436269" 
lastFinishedPulling="2026-04-25 00:06:34.926939972 +0000 UTC m=+773.531666558" observedRunningTime="2026-04-25 00:06:35.844797639 +0000 UTC m=+774.449524237" watchObservedRunningTime="2026-04-25 00:06:35.848121661 +0000 UTC m=+774.452848260" Apr 25 00:06:36.832715 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:06:36.832674 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-645c8c64c48q" podUID="36b7f2db-6bc4-4972-a574-ceba0e7d310a" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 25 00:06:41.956226 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:06:41.956194 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-645c8c64c48q" Apr 25 00:06:41.956823 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:06:41.956239 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-645c8c64c48q" Apr 25 00:06:41.956823 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:06:41.956378 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-645c8c64c48q" podUID="36b7f2db-6bc4-4972-a574-ceba0e7d310a" containerName="tokenizer" probeResult="failure" output="Get \"http://10.133.0.51:8082/healthz\": dial tcp 10.133.0.51:8082: connect: connection refused" Apr 25 00:06:41.957878 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:06:41.957853 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-645c8c64c48q" podUID="36b7f2db-6bc4-4972-a574-ceba0e7d310a" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 25 00:06:51.957351 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:06:51.957317 
2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-645c8c64c48q" Apr 25 00:06:51.957962 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:06:51.957712 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-645c8c64c48q" podUID="36b7f2db-6bc4-4972-a574-ceba0e7d310a" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 25 00:06:51.958838 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:06:51.958813 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-645c8c64c48q" Apr 25 00:06:51.958961 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:06:51.958840 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-645c8c64c48q" podUID="36b7f2db-6bc4-4972-a574-ceba0e7d310a" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 25 00:06:52.891482 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:06:52.891450 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-645c8c64c48q" podUID="36b7f2db-6bc4-4972-a574-ceba0e7d310a" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 25 00:07:02.892107 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:02.892019 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-645c8c64c48q" podUID="36b7f2db-6bc4-4972-a574-ceba0e7d310a" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 25 00:07:12.891825 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:12.891785 
2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-645c8c64c48q" podUID="36b7f2db-6bc4-4972-a574-ceba0e7d310a" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 25 00:07:22.891911 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:22.891873 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-645c8c64c48q" podUID="36b7f2db-6bc4-4972-a574-ceba0e7d310a" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 25 00:07:32.891597 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:32.891560 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-645c8c64c48q" podUID="36b7f2db-6bc4-4972-a574-ceba0e7d310a" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 25 00:07:42.892115 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:42.892072 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-645c8c64c48q" podUID="36b7f2db-6bc4-4972-a574-ceba0e7d310a" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 25 00:07:46.087489 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:46.087451 2569 generic.go:358] "Generic (PLEG): container finished" podID="18ade40a-660f-4590-a2b0-dc23c09e5fa2" containerID="47320475a2b7b9fc7fc0f8ff7156b8428fd435ec590256f6f93f2ac911430b76" exitCode=0 Apr 25 00:07:46.087907 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:46.087524 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-84975459c8-whc47" 
event={"ID":"18ade40a-660f-4590-a2b0-dc23c09e5fa2","Type":"ContainerDied","Data":"47320475a2b7b9fc7fc0f8ff7156b8428fd435ec590256f6f93f2ac911430b76"} Apr 25 00:07:47.092694 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:47.092660 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-84975459c8-whc47" event={"ID":"18ade40a-660f-4590-a2b0-dc23c09e5fa2","Type":"ContainerStarted","Data":"02e691964f0a8af625c205c34f4a9e609b7af069ec91a34c42760a1b013fece9"} Apr 25 00:07:47.113419 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:47.113343 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-84975459c8-whc47" podStartSLOduration=106.113324874 podStartE2EDuration="1m46.113324874s" podCreationTimestamp="2026-04-25 00:06:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-25 00:07:47.110416508 +0000 UTC m=+845.715143098" watchObservedRunningTime="2026-04-25 00:07:47.113324874 +0000 UTC m=+845.718051473" Apr 25 00:07:51.681775 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:51.681738 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-84975459c8-whc47" Apr 25 00:07:51.682261 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:51.681896 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-84975459c8-whc47" Apr 25 00:07:51.694016 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:51.693989 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-84975459c8-whc47" Apr 25 00:07:52.121670 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:52.121643 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-84975459c8-whc47" Apr 25 00:07:52.851697 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:52.851659 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-645c8c64c48q"] Apr 25 00:07:52.852094 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:52.852047 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-645c8c64c48q" podUID="36b7f2db-6bc4-4972-a574-ceba0e7d310a" containerName="main" containerID="cri-o://ed9e2c01acebfd3490bd3def23a3a24724485513fbcf7adc45d473500f381001" gracePeriod=30 Apr 25 00:07:52.852382 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:52.852348 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-645c8c64c48q" podUID="36b7f2db-6bc4-4972-a574-ceba0e7d310a" containerName="tokenizer" containerID="cri-o://00434ecc0394c591959ac2b82fb2b2eaf40dc55ecc9eb1ff36f7c79498fbfbb0" gracePeriod=30 Apr 25 00:07:52.853926 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:52.853809 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-645c8c64c48q" podUID="36b7f2db-6bc4-4972-a574-ceba0e7d310a" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 25 00:07:52.860726 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:52.860700 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-84975459c8-whc47"] Apr 25 00:07:53.117066 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:53.116969 2569 generic.go:358] "Generic (PLEG): container finished" podID="36b7f2db-6bc4-4972-a574-ceba0e7d310a" containerID="ed9e2c01acebfd3490bd3def23a3a24724485513fbcf7adc45d473500f381001" 
exitCode=0 Apr 25 00:07:53.117066 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:53.117044 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-645c8c64c48q" event={"ID":"36b7f2db-6bc4-4972-a574-ceba0e7d310a","Type":"ContainerDied","Data":"ed9e2c01acebfd3490bd3def23a3a24724485513fbcf7adc45d473500f381001"} Apr 25 00:07:53.154805 ip-10-0-133-214 kubenswrapper[2569]: E0425 00:07:53.154776 2569 secret.go:189] Couldn't get secret kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-self-signed-certs: secret "scheduler-ha-replicas-test-kserve-self-signed-certs" not found Apr 25 00:07:53.154958 ip-10-0-133-214 kubenswrapper[2569]: E0425 00:07:53.154843 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/18ade40a-660f-4590-a2b0-dc23c09e5fa2-tls-certs podName:18ade40a-660f-4590-a2b0-dc23c09e5fa2 nodeName:}" failed. No retries permitted until 2026-04-25 00:07:53.654823929 +0000 UTC m=+852.259550509 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/18ade40a-660f-4590-a2b0-dc23c09e5fa2-tls-certs") pod "scheduler-ha-replicas-test-kserve-84975459c8-whc47" (UID: "18ade40a-660f-4590-a2b0-dc23c09e5fa2") : secret "scheduler-ha-replicas-test-kserve-self-signed-certs" not found Apr 25 00:07:53.659206 ip-10-0-133-214 kubenswrapper[2569]: E0425 00:07:53.659169 2569 secret.go:189] Couldn't get secret kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-self-signed-certs: secret "scheduler-ha-replicas-test-kserve-self-signed-certs" not found Apr 25 00:07:53.659395 ip-10-0-133-214 kubenswrapper[2569]: E0425 00:07:53.659267 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/18ade40a-660f-4590-a2b0-dc23c09e5fa2-tls-certs podName:18ade40a-660f-4590-a2b0-dc23c09e5fa2 nodeName:}" failed. 
No retries permitted until 2026-04-25 00:07:54.65924855 +0000 UTC m=+853.263975126 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/18ade40a-660f-4590-a2b0-dc23c09e5fa2-tls-certs") pod "scheduler-ha-replicas-test-kserve-84975459c8-whc47" (UID: "18ade40a-660f-4590-a2b0-dc23c09e5fa2") : secret "scheduler-ha-replicas-test-kserve-self-signed-certs" not found Apr 25 00:07:54.099574 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:54.099550 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-645c8c64c48q" Apr 25 00:07:54.130446 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:54.130397 2569 generic.go:358] "Generic (PLEG): container finished" podID="36b7f2db-6bc4-4972-a574-ceba0e7d310a" containerID="00434ecc0394c591959ac2b82fb2b2eaf40dc55ecc9eb1ff36f7c79498fbfbb0" exitCode=0 Apr 25 00:07:54.130446 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:54.130435 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-645c8c64c48q" event={"ID":"36b7f2db-6bc4-4972-a574-ceba0e7d310a","Type":"ContainerDied","Data":"00434ecc0394c591959ac2b82fb2b2eaf40dc55ecc9eb1ff36f7c79498fbfbb0"} Apr 25 00:07:54.130661 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:54.130482 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-645c8c64c48q" event={"ID":"36b7f2db-6bc4-4972-a574-ceba0e7d310a","Type":"ContainerDied","Data":"f248d45a1e6b843d7c868bb1295127e53e7175d88fc26464cb92f11a3cc755a8"} Apr 25 00:07:54.130661 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:54.130503 2569 scope.go:117] "RemoveContainer" containerID="00434ecc0394c591959ac2b82fb2b2eaf40dc55ecc9eb1ff36f7c79498fbfbb0" Apr 25 00:07:54.130661 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:54.130545 2569 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-645c8c64c48q" Apr 25 00:07:54.131227 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:54.131145 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-84975459c8-whc47" podUID="18ade40a-660f-4590-a2b0-dc23c09e5fa2" containerName="main" containerID="cri-o://02e691964f0a8af625c205c34f4a9e609b7af069ec91a34c42760a1b013fece9" gracePeriod=30 Apr 25 00:07:54.140145 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:54.140129 2569 scope.go:117] "RemoveContainer" containerID="ed9e2c01acebfd3490bd3def23a3a24724485513fbcf7adc45d473500f381001" Apr 25 00:07:54.163379 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:54.163359 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/36b7f2db-6bc4-4972-a574-ceba0e7d310a-kserve-provision-location\") pod \"36b7f2db-6bc4-4972-a574-ceba0e7d310a\" (UID: \"36b7f2db-6bc4-4972-a574-ceba0e7d310a\") " Apr 25 00:07:54.163475 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:54.163432 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/36b7f2db-6bc4-4972-a574-ceba0e7d310a-tokenizer-tmp\") pod \"36b7f2db-6bc4-4972-a574-ceba0e7d310a\" (UID: \"36b7f2db-6bc4-4972-a574-ceba0e7d310a\") " Apr 25 00:07:54.163475 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:54.163464 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/36b7f2db-6bc4-4972-a574-ceba0e7d310a-tokenizer-cache\") pod \"36b7f2db-6bc4-4972-a574-ceba0e7d310a\" (UID: \"36b7f2db-6bc4-4972-a574-ceba0e7d310a\") " Apr 25 00:07:54.163544 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:54.163487 2569 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mpbdk\" (UniqueName: \"kubernetes.io/projected/36b7f2db-6bc4-4972-a574-ceba0e7d310a-kube-api-access-mpbdk\") pod \"36b7f2db-6bc4-4972-a574-ceba0e7d310a\" (UID: \"36b7f2db-6bc4-4972-a574-ceba0e7d310a\") " Apr 25 00:07:54.163544 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:54.163529 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/36b7f2db-6bc4-4972-a574-ceba0e7d310a-tokenizer-uds\") pod \"36b7f2db-6bc4-4972-a574-ceba0e7d310a\" (UID: \"36b7f2db-6bc4-4972-a574-ceba0e7d310a\") " Apr 25 00:07:54.163616 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:54.163578 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/36b7f2db-6bc4-4972-a574-ceba0e7d310a-tls-certs\") pod \"36b7f2db-6bc4-4972-a574-ceba0e7d310a\" (UID: \"36b7f2db-6bc4-4972-a574-ceba0e7d310a\") " Apr 25 00:07:54.163742 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:54.163715 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36b7f2db-6bc4-4972-a574-ceba0e7d310a-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "36b7f2db-6bc4-4972-a574-ceba0e7d310a" (UID: "36b7f2db-6bc4-4972-a574-ceba0e7d310a"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 25 00:07:54.163827 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:54.163805 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36b7f2db-6bc4-4972-a574-ceba0e7d310a-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "36b7f2db-6bc4-4972-a574-ceba0e7d310a" (UID: "36b7f2db-6bc4-4972-a574-ceba0e7d310a"). InnerVolumeSpecName "tokenizer-tmp". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 25 00:07:54.163888 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:54.163820 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36b7f2db-6bc4-4972-a574-ceba0e7d310a-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "36b7f2db-6bc4-4972-a574-ceba0e7d310a" (UID: "36b7f2db-6bc4-4972-a574-ceba0e7d310a"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 25 00:07:54.163945 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:54.163907 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/36b7f2db-6bc4-4972-a574-ceba0e7d310a-tokenizer-tmp\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\"" Apr 25 00:07:54.163945 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:54.163927 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/36b7f2db-6bc4-4972-a574-ceba0e7d310a-tokenizer-cache\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\"" Apr 25 00:07:54.163945 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:54.163942 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/36b7f2db-6bc4-4972-a574-ceba0e7d310a-tokenizer-uds\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\"" Apr 25 00:07:54.164112 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:54.164094 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36b7f2db-6bc4-4972-a574-ceba0e7d310a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "36b7f2db-6bc4-4972-a574-ceba0e7d310a" (UID: "36b7f2db-6bc4-4972-a574-ceba0e7d310a"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 25 00:07:54.165581 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:54.165553 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36b7f2db-6bc4-4972-a574-ceba0e7d310a-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "36b7f2db-6bc4-4972-a574-ceba0e7d310a" (UID: "36b7f2db-6bc4-4972-a574-ceba0e7d310a"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 25 00:07:54.165581 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:54.165558 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36b7f2db-6bc4-4972-a574-ceba0e7d310a-kube-api-access-mpbdk" (OuterVolumeSpecName: "kube-api-access-mpbdk") pod "36b7f2db-6bc4-4972-a574-ceba0e7d310a" (UID: "36b7f2db-6bc4-4972-a574-ceba0e7d310a"). InnerVolumeSpecName "kube-api-access-mpbdk". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 25 00:07:54.242339 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:54.242315 2569 scope.go:117] "RemoveContainer" containerID="211d26694bc7405408ab1ed1f4f1e6f9a2b2ecf5ae7942314eed0709b54e7059" Apr 25 00:07:54.250596 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:54.250572 2569 scope.go:117] "RemoveContainer" containerID="00434ecc0394c591959ac2b82fb2b2eaf40dc55ecc9eb1ff36f7c79498fbfbb0" Apr 25 00:07:54.250900 ip-10-0-133-214 kubenswrapper[2569]: E0425 00:07:54.250880 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00434ecc0394c591959ac2b82fb2b2eaf40dc55ecc9eb1ff36f7c79498fbfbb0\": container with ID starting with 00434ecc0394c591959ac2b82fb2b2eaf40dc55ecc9eb1ff36f7c79498fbfbb0 not found: ID does not exist" containerID="00434ecc0394c591959ac2b82fb2b2eaf40dc55ecc9eb1ff36f7c79498fbfbb0" Apr 25 00:07:54.250959 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:54.250910 2569 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"00434ecc0394c591959ac2b82fb2b2eaf40dc55ecc9eb1ff36f7c79498fbfbb0"} err="failed to get container status \"00434ecc0394c591959ac2b82fb2b2eaf40dc55ecc9eb1ff36f7c79498fbfbb0\": rpc error: code = NotFound desc = could not find container \"00434ecc0394c591959ac2b82fb2b2eaf40dc55ecc9eb1ff36f7c79498fbfbb0\": container with ID starting with 00434ecc0394c591959ac2b82fb2b2eaf40dc55ecc9eb1ff36f7c79498fbfbb0 not found: ID does not exist" Apr 25 00:07:54.250959 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:54.250928 2569 scope.go:117] "RemoveContainer" containerID="ed9e2c01acebfd3490bd3def23a3a24724485513fbcf7adc45d473500f381001" Apr 25 00:07:54.251194 ip-10-0-133-214 kubenswrapper[2569]: E0425 00:07:54.251178 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed9e2c01acebfd3490bd3def23a3a24724485513fbcf7adc45d473500f381001\": container with ID starting with ed9e2c01acebfd3490bd3def23a3a24724485513fbcf7adc45d473500f381001 not found: ID does not exist" containerID="ed9e2c01acebfd3490bd3def23a3a24724485513fbcf7adc45d473500f381001" Apr 25 00:07:54.251248 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:54.251198 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed9e2c01acebfd3490bd3def23a3a24724485513fbcf7adc45d473500f381001"} err="failed to get container status \"ed9e2c01acebfd3490bd3def23a3a24724485513fbcf7adc45d473500f381001\": rpc error: code = NotFound desc = could not find container \"ed9e2c01acebfd3490bd3def23a3a24724485513fbcf7adc45d473500f381001\": container with ID starting with ed9e2c01acebfd3490bd3def23a3a24724485513fbcf7adc45d473500f381001 not found: ID does not exist" Apr 25 00:07:54.251248 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:54.251211 2569 scope.go:117] "RemoveContainer" containerID="211d26694bc7405408ab1ed1f4f1e6f9a2b2ecf5ae7942314eed0709b54e7059" Apr 25 00:07:54.251466 
ip-10-0-133-214 kubenswrapper[2569]: E0425 00:07:54.251447 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"211d26694bc7405408ab1ed1f4f1e6f9a2b2ecf5ae7942314eed0709b54e7059\": container with ID starting with 211d26694bc7405408ab1ed1f4f1e6f9a2b2ecf5ae7942314eed0709b54e7059 not found: ID does not exist" containerID="211d26694bc7405408ab1ed1f4f1e6f9a2b2ecf5ae7942314eed0709b54e7059" Apr 25 00:07:54.251524 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:54.251469 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"211d26694bc7405408ab1ed1f4f1e6f9a2b2ecf5ae7942314eed0709b54e7059"} err="failed to get container status \"211d26694bc7405408ab1ed1f4f1e6f9a2b2ecf5ae7942314eed0709b54e7059\": rpc error: code = NotFound desc = could not find container \"211d26694bc7405408ab1ed1f4f1e6f9a2b2ecf5ae7942314eed0709b54e7059\": container with ID starting with 211d26694bc7405408ab1ed1f4f1e6f9a2b2ecf5ae7942314eed0709b54e7059 not found: ID does not exist" Apr 25 00:07:54.265038 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:54.265015 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mpbdk\" (UniqueName: \"kubernetes.io/projected/36b7f2db-6bc4-4972-a574-ceba0e7d310a-kube-api-access-mpbdk\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\"" Apr 25 00:07:54.265038 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:54.265040 2569 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/36b7f2db-6bc4-4972-a574-ceba0e7d310a-tls-certs\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\"" Apr 25 00:07:54.265169 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:54.265051 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/36b7f2db-6bc4-4972-a574-ceba0e7d310a-kserve-provision-location\") on node 
\"ip-10-0-133-214.ec2.internal\" DevicePath \"\""
Apr 25 00:07:54.375863 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:54.375844 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-84975459c8-whc47"
Apr 25 00:07:54.455677 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:54.455646 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-645c8c64c48q"]
Apr 25 00:07:54.460694 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:54.460670 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-645c8c64c48q"]
Apr 25 00:07:54.466386 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:54.466368 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xr6wr\" (UniqueName: \"kubernetes.io/projected/18ade40a-660f-4590-a2b0-dc23c09e5fa2-kube-api-access-xr6wr\") pod \"18ade40a-660f-4590-a2b0-dc23c09e5fa2\" (UID: \"18ade40a-660f-4590-a2b0-dc23c09e5fa2\") "
Apr 25 00:07:54.466513 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:54.466421 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/18ade40a-660f-4590-a2b0-dc23c09e5fa2-kserve-provision-location\") pod \"18ade40a-660f-4590-a2b0-dc23c09e5fa2\" (UID: \"18ade40a-660f-4590-a2b0-dc23c09e5fa2\") "
Apr 25 00:07:54.466513 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:54.466450 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/18ade40a-660f-4590-a2b0-dc23c09e5fa2-model-cache\") pod \"18ade40a-660f-4590-a2b0-dc23c09e5fa2\" (UID: \"18ade40a-660f-4590-a2b0-dc23c09e5fa2\") "
Apr 25 00:07:54.466513 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:54.466498 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/18ade40a-660f-4590-a2b0-dc23c09e5fa2-home\") pod \"18ade40a-660f-4590-a2b0-dc23c09e5fa2\" (UID: \"18ade40a-660f-4590-a2b0-dc23c09e5fa2\") "
Apr 25 00:07:54.466686 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:54.466528 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/18ade40a-660f-4590-a2b0-dc23c09e5fa2-tls-certs\") pod \"18ade40a-660f-4590-a2b0-dc23c09e5fa2\" (UID: \"18ade40a-660f-4590-a2b0-dc23c09e5fa2\") "
Apr 25 00:07:54.466686 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:54.466567 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/18ade40a-660f-4590-a2b0-dc23c09e5fa2-tmp-dir\") pod \"18ade40a-660f-4590-a2b0-dc23c09e5fa2\" (UID: \"18ade40a-660f-4590-a2b0-dc23c09e5fa2\") "
Apr 25 00:07:54.466686 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:54.466596 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/18ade40a-660f-4590-a2b0-dc23c09e5fa2-dshm\") pod \"18ade40a-660f-4590-a2b0-dc23c09e5fa2\" (UID: \"18ade40a-660f-4590-a2b0-dc23c09e5fa2\") "
Apr 25 00:07:54.466686 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:54.466649 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18ade40a-660f-4590-a2b0-dc23c09e5fa2-model-cache" (OuterVolumeSpecName: "model-cache") pod "18ade40a-660f-4590-a2b0-dc23c09e5fa2" (UID: "18ade40a-660f-4590-a2b0-dc23c09e5fa2"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 25 00:07:54.466868 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:54.466823 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18ade40a-660f-4590-a2b0-dc23c09e5fa2-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "18ade40a-660f-4590-a2b0-dc23c09e5fa2" (UID: "18ade40a-660f-4590-a2b0-dc23c09e5fa2"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 25 00:07:54.466868 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:54.466843 2569 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/18ade40a-660f-4590-a2b0-dc23c09e5fa2-model-cache\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\""
Apr 25 00:07:54.466868 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:54.466831 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18ade40a-660f-4590-a2b0-dc23c09e5fa2-home" (OuterVolumeSpecName: "home") pod "18ade40a-660f-4590-a2b0-dc23c09e5fa2" (UID: "18ade40a-660f-4590-a2b0-dc23c09e5fa2"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 25 00:07:54.468613 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:54.468588 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18ade40a-660f-4590-a2b0-dc23c09e5fa2-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "18ade40a-660f-4590-a2b0-dc23c09e5fa2" (UID: "18ade40a-660f-4590-a2b0-dc23c09e5fa2"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 25 00:07:54.468707 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:54.468588 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18ade40a-660f-4590-a2b0-dc23c09e5fa2-dshm" (OuterVolumeSpecName: "dshm") pod "18ade40a-660f-4590-a2b0-dc23c09e5fa2" (UID: "18ade40a-660f-4590-a2b0-dc23c09e5fa2"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 25 00:07:54.468821 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:54.468806 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18ade40a-660f-4590-a2b0-dc23c09e5fa2-kube-api-access-xr6wr" (OuterVolumeSpecName: "kube-api-access-xr6wr") pod "18ade40a-660f-4590-a2b0-dc23c09e5fa2" (UID: "18ade40a-660f-4590-a2b0-dc23c09e5fa2"). InnerVolumeSpecName "kube-api-access-xr6wr". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 25 00:07:54.529953 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:54.529919 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18ade40a-660f-4590-a2b0-dc23c09e5fa2-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "18ade40a-660f-4590-a2b0-dc23c09e5fa2" (UID: "18ade40a-660f-4590-a2b0-dc23c09e5fa2"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 25 00:07:54.567526 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:54.567503 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/18ade40a-660f-4590-a2b0-dc23c09e5fa2-kserve-provision-location\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\""
Apr 25 00:07:54.567526 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:54.567525 2569 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/18ade40a-660f-4590-a2b0-dc23c09e5fa2-home\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\""
Apr 25 00:07:54.567649 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:54.567536 2569 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/18ade40a-660f-4590-a2b0-dc23c09e5fa2-tls-certs\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\""
Apr 25 00:07:54.567649 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:54.567546 2569 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/18ade40a-660f-4590-a2b0-dc23c09e5fa2-tmp-dir\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\""
Apr 25 00:07:54.567649 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:54.567554 2569 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/18ade40a-660f-4590-a2b0-dc23c09e5fa2-dshm\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\""
Apr 25 00:07:54.567649 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:54.567562 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xr6wr\" (UniqueName: \"kubernetes.io/projected/18ade40a-660f-4590-a2b0-dc23c09e5fa2-kube-api-access-xr6wr\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\""
Apr 25 00:07:55.135155 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:55.135067 2569 generic.go:358]
"Generic (PLEG): container finished" podID="18ade40a-660f-4590-a2b0-dc23c09e5fa2" containerID="02e691964f0a8af625c205c34f4a9e609b7af069ec91a34c42760a1b013fece9" exitCode=0
Apr 25 00:07:55.135155 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:55.135143 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-84975459c8-whc47"
Apr 25 00:07:55.135705 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:55.135147 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-84975459c8-whc47" event={"ID":"18ade40a-660f-4590-a2b0-dc23c09e5fa2","Type":"ContainerDied","Data":"02e691964f0a8af625c205c34f4a9e609b7af069ec91a34c42760a1b013fece9"}
Apr 25 00:07:55.135705 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:55.135186 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-84975459c8-whc47" event={"ID":"18ade40a-660f-4590-a2b0-dc23c09e5fa2","Type":"ContainerDied","Data":"3fe4de17daa0c23067d488eb00816d635a59f655f5ba0b2898312b2529b77319"}
Apr 25 00:07:55.135705 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:55.135201 2569 scope.go:117] "RemoveContainer" containerID="02e691964f0a8af625c205c34f4a9e609b7af069ec91a34c42760a1b013fece9"
Apr 25 00:07:55.144594 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:55.144511 2569 scope.go:117] "RemoveContainer" containerID="47320475a2b7b9fc7fc0f8ff7156b8428fd435ec590256f6f93f2ac911430b76"
Apr 25 00:07:55.154956 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:55.154935 2569 scope.go:117] "RemoveContainer" containerID="02e691964f0a8af625c205c34f4a9e609b7af069ec91a34c42760a1b013fece9"
Apr 25 00:07:55.155489 ip-10-0-133-214 kubenswrapper[2569]: E0425 00:07:55.155397 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02e691964f0a8af625c205c34f4a9e609b7af069ec91a34c42760a1b013fece9\": container with ID starting with 02e691964f0a8af625c205c34f4a9e609b7af069ec91a34c42760a1b013fece9 not found: ID does not exist" containerID="02e691964f0a8af625c205c34f4a9e609b7af069ec91a34c42760a1b013fece9"
Apr 25 00:07:55.155489 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:55.155458 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02e691964f0a8af625c205c34f4a9e609b7af069ec91a34c42760a1b013fece9"} err="failed to get container status \"02e691964f0a8af625c205c34f4a9e609b7af069ec91a34c42760a1b013fece9\": rpc error: code = NotFound desc = could not find container \"02e691964f0a8af625c205c34f4a9e609b7af069ec91a34c42760a1b013fece9\": container with ID starting with 02e691964f0a8af625c205c34f4a9e609b7af069ec91a34c42760a1b013fece9 not found: ID does not exist"
Apr 25 00:07:55.155489 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:55.155482 2569 scope.go:117] "RemoveContainer" containerID="47320475a2b7b9fc7fc0f8ff7156b8428fd435ec590256f6f93f2ac911430b76"
Apr 25 00:07:55.156131 ip-10-0-133-214 kubenswrapper[2569]: E0425 00:07:55.156103 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47320475a2b7b9fc7fc0f8ff7156b8428fd435ec590256f6f93f2ac911430b76\": container with ID starting with 47320475a2b7b9fc7fc0f8ff7156b8428fd435ec590256f6f93f2ac911430b76 not found: ID does not exist" containerID="47320475a2b7b9fc7fc0f8ff7156b8428fd435ec590256f6f93f2ac911430b76"
Apr 25 00:07:55.156237 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:55.156130 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47320475a2b7b9fc7fc0f8ff7156b8428fd435ec590256f6f93f2ac911430b76"} err="failed to get container status \"47320475a2b7b9fc7fc0f8ff7156b8428fd435ec590256f6f93f2ac911430b76\": rpc error: code = NotFound desc = could not find container \"47320475a2b7b9fc7fc0f8ff7156b8428fd435ec590256f6f93f2ac911430b76\": container with ID starting with 47320475a2b7b9fc7fc0f8ff7156b8428fd435ec590256f6f93f2ac911430b76 not found: ID does not exist"
Apr 25 00:07:55.158225 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:55.158203 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-84975459c8-whc47"]
Apr 25 00:07:55.161483 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:55.161465 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-84975459c8-whc47"]
Apr 25 00:07:55.977196 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:55.977165 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18ade40a-660f-4590-a2b0-dc23c09e5fa2" path="/var/lib/kubelet/pods/18ade40a-660f-4590-a2b0-dc23c09e5fa2/volumes"
Apr 25 00:07:55.977611 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:55.977598 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36b7f2db-6bc4-4972-a574-ceba0e7d310a" path="/var/lib/kubelet/pods/36b7f2db-6bc4-4972-a574-ceba0e7d310a/volumes"
Apr 25 00:07:58.143688 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:58.143652 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-74cfd44d4-xprjr"]
Apr 25 00:07:58.144137 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:58.144029 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="36b7f2db-6bc4-4972-a574-ceba0e7d310a" containerName="main"
Apr 25 00:07:58.144137 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:58.144047 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="36b7f2db-6bc4-4972-a574-ceba0e7d310a" containerName="main"
Apr 25 00:07:58.144137 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:58.144071 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="18ade40a-660f-4590-a2b0-dc23c09e5fa2" containerName="main"
Apr 25 00:07:58.144137 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:58.144080 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="18ade40a-660f-4590-a2b0-dc23c09e5fa2" containerName="main"
Apr 25 00:07:58.144137 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:58.144091 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="18ade40a-660f-4590-a2b0-dc23c09e5fa2" containerName="storage-initializer"
Apr 25 00:07:58.144137 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:58.144100 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="18ade40a-660f-4590-a2b0-dc23c09e5fa2" containerName="storage-initializer"
Apr 25 00:07:58.144137 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:58.144112 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="36b7f2db-6bc4-4972-a574-ceba0e7d310a" containerName="tokenizer"
Apr 25 00:07:58.144137 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:58.144119 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="36b7f2db-6bc4-4972-a574-ceba0e7d310a" containerName="tokenizer"
Apr 25 00:07:58.144137 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:58.144131 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="36b7f2db-6bc4-4972-a574-ceba0e7d310a" containerName="storage-initializer"
Apr 25 00:07:58.144137 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:58.144138 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="36b7f2db-6bc4-4972-a574-ceba0e7d310a" containerName="storage-initializer"
Apr 25 00:07:58.144752 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:58.144261 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="36b7f2db-6bc4-4972-a574-ceba0e7d310a" containerName="tokenizer"
Apr 25 00:07:58.144752 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:58.144274 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="18ade40a-660f-4590-a2b0-dc23c09e5fa2" containerName="main"
Apr 25 00:07:58.144752 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:58.144285 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="36b7f2db-6bc4-4972-a574-ceba0e7d310a" containerName="main"
Apr 25 00:07:58.149620 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:58.149600 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-74cfd44d4-xprjr"
Apr 25 00:07:58.151944 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:58.151925 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\""
Apr 25 00:07:58.152052 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:58.151958 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-4dwzn\""
Apr 25 00:07:58.152628 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:58.152605 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"precise-prefix-cache-test-kserve-self-signed-certs\""
Apr 25 00:07:58.152719 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:58.152641 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 25 00:07:58.156903 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:58.156883 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-74cfd44d4-xprjr"]
Apr 25 00:07:58.195714 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:58.195686 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5112a663-07dc-4f85-b04c-590fede83755-model-cache\") pod \"precise-prefix-cache-test-kserve-74cfd44d4-xprjr\" (UID: \"5112a663-07dc-4f85-b04c-590fede83755\") "
pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-74cfd44d4-xprjr"
Apr 25 00:07:58.195864 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:58.195726 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5112a663-07dc-4f85-b04c-590fede83755-tls-certs\") pod \"precise-prefix-cache-test-kserve-74cfd44d4-xprjr\" (UID: \"5112a663-07dc-4f85-b04c-590fede83755\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-74cfd44d4-xprjr"
Apr 25 00:07:58.195864 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:58.195749 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5112a663-07dc-4f85-b04c-590fede83755-tmp-dir\") pod \"precise-prefix-cache-test-kserve-74cfd44d4-xprjr\" (UID: \"5112a663-07dc-4f85-b04c-590fede83755\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-74cfd44d4-xprjr"
Apr 25 00:07:58.195986 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:58.195859 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5112a663-07dc-4f85-b04c-590fede83755-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-74cfd44d4-xprjr\" (UID: \"5112a663-07dc-4f85-b04c-590fede83755\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-74cfd44d4-xprjr"
Apr 25 00:07:58.195986 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:58.195895 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpcs5\" (UniqueName: \"kubernetes.io/projected/5112a663-07dc-4f85-b04c-590fede83755-kube-api-access-dpcs5\") pod \"precise-prefix-cache-test-kserve-74cfd44d4-xprjr\" (UID: \"5112a663-07dc-4f85-b04c-590fede83755\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-74cfd44d4-xprjr"
Apr 25 00:07:58.195986 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:58.195925 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5112a663-07dc-4f85-b04c-590fede83755-dshm\") pod \"precise-prefix-cache-test-kserve-74cfd44d4-xprjr\" (UID: \"5112a663-07dc-4f85-b04c-590fede83755\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-74cfd44d4-xprjr"
Apr 25 00:07:58.195986 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:58.195980 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5112a663-07dc-4f85-b04c-590fede83755-home\") pod \"precise-prefix-cache-test-kserve-74cfd44d4-xprjr\" (UID: \"5112a663-07dc-4f85-b04c-590fede83755\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-74cfd44d4-xprjr"
Apr 25 00:07:58.297173 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:58.297138 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5112a663-07dc-4f85-b04c-590fede83755-model-cache\") pod \"precise-prefix-cache-test-kserve-74cfd44d4-xprjr\" (UID: \"5112a663-07dc-4f85-b04c-590fede83755\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-74cfd44d4-xprjr"
Apr 25 00:07:58.297336 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:58.297299 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5112a663-07dc-4f85-b04c-590fede83755-tls-certs\") pod \"precise-prefix-cache-test-kserve-74cfd44d4-xprjr\" (UID: \"5112a663-07dc-4f85-b04c-590fede83755\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-74cfd44d4-xprjr"
Apr 25 00:07:58.297468 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:58.297340 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5112a663-07dc-4f85-b04c-590fede83755-tmp-dir\") pod \"precise-prefix-cache-test-kserve-74cfd44d4-xprjr\" (UID: \"5112a663-07dc-4f85-b04c-590fede83755\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-74cfd44d4-xprjr"
Apr 25 00:07:58.297468 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:58.297397 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5112a663-07dc-4f85-b04c-590fede83755-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-74cfd44d4-xprjr\" (UID: \"5112a663-07dc-4f85-b04c-590fede83755\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-74cfd44d4-xprjr"
Apr 25 00:07:58.297468 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:58.297456 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dpcs5\" (UniqueName: \"kubernetes.io/projected/5112a663-07dc-4f85-b04c-590fede83755-kube-api-access-dpcs5\") pod \"precise-prefix-cache-test-kserve-74cfd44d4-xprjr\" (UID: \"5112a663-07dc-4f85-b04c-590fede83755\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-74cfd44d4-xprjr"
Apr 25 00:07:58.297643 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:58.297483 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5112a663-07dc-4f85-b04c-590fede83755-dshm\") pod \"precise-prefix-cache-test-kserve-74cfd44d4-xprjr\" (UID: \"5112a663-07dc-4f85-b04c-590fede83755\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-74cfd44d4-xprjr"
Apr 25 00:07:58.297643 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:58.297546 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5112a663-07dc-4f85-b04c-590fede83755-home\") pod \"precise-prefix-cache-test-kserve-74cfd44d4-xprjr\" (UID: \"5112a663-07dc-4f85-b04c-590fede83755\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-74cfd44d4-xprjr"
Apr 25 00:07:58.297746 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:58.297641 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5112a663-07dc-4f85-b04c-590fede83755-model-cache\") pod \"precise-prefix-cache-test-kserve-74cfd44d4-xprjr\" (UID: \"5112a663-07dc-4f85-b04c-590fede83755\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-74cfd44d4-xprjr"
Apr 25 00:07:58.298255 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:58.298225 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5112a663-07dc-4f85-b04c-590fede83755-tmp-dir\") pod \"precise-prefix-cache-test-kserve-74cfd44d4-xprjr\" (UID: \"5112a663-07dc-4f85-b04c-590fede83755\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-74cfd44d4-xprjr"
Apr 25 00:07:58.298388 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:58.298307 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5112a663-07dc-4f85-b04c-590fede83755-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-74cfd44d4-xprjr\" (UID: \"5112a663-07dc-4f85-b04c-590fede83755\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-74cfd44d4-xprjr"
Apr 25 00:07:58.298388 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:58.298322 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5112a663-07dc-4f85-b04c-590fede83755-home\") pod \"precise-prefix-cache-test-kserve-74cfd44d4-xprjr\" (UID: \"5112a663-07dc-4f85-b04c-590fede83755\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-74cfd44d4-xprjr"
Apr 25 00:07:58.300590 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:58.300560 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5112a663-07dc-4f85-b04c-590fede83755-dshm\") pod \"precise-prefix-cache-test-kserve-74cfd44d4-xprjr\" (UID: \"5112a663-07dc-4f85-b04c-590fede83755\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-74cfd44d4-xprjr"
Apr 25 00:07:58.305836 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:58.305810 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5112a663-07dc-4f85-b04c-590fede83755-tls-certs\") pod \"precise-prefix-cache-test-kserve-74cfd44d4-xprjr\" (UID: \"5112a663-07dc-4f85-b04c-590fede83755\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-74cfd44d4-xprjr"
Apr 25 00:07:58.308505 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:58.308482 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpcs5\" (UniqueName: \"kubernetes.io/projected/5112a663-07dc-4f85-b04c-590fede83755-kube-api-access-dpcs5\") pod \"precise-prefix-cache-test-kserve-74cfd44d4-xprjr\" (UID: \"5112a663-07dc-4f85-b04c-590fede83755\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-74cfd44d4-xprjr"
Apr 25 00:07:58.461205 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:58.461113 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-74cfd44d4-xprjr"
Apr 25 00:07:58.471185 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:58.471158 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-745db564zwn79"]
Apr 25 00:07:58.477170 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:58.476922 2569 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-745db564zwn79"
Apr 25 00:07:58.479430 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:58.479396 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"precise-prefix-cache-test-epp-sa-dockercfg-mpr54\""
Apr 25 00:07:58.486231 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:58.486206 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-745db564zwn79"]
Apr 25 00:07:58.499017 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:58.498976 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/9af4f27b-8648-4b73-9417-ff0ac304c1aa-tokenizer-cache\") pod \"precise-prefix-cache-test-kserve-router-scheduler-745db564zwn79\" (UID: \"9af4f27b-8648-4b73-9417-ff0ac304c1aa\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-745db564zwn79"
Apr 25 00:07:58.499142 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:58.499021 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdg8q\" (UniqueName: \"kubernetes.io/projected/9af4f27b-8648-4b73-9417-ff0ac304c1aa-kube-api-access-zdg8q\") pod \"precise-prefix-cache-test-kserve-router-scheduler-745db564zwn79\" (UID: \"9af4f27b-8648-4b73-9417-ff0ac304c1aa\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-745db564zwn79"
Apr 25 00:07:58.499229 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:58.499139 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9af4f27b-8648-4b73-9417-ff0ac304c1aa-tls-certs\") pod \"precise-prefix-cache-test-kserve-router-scheduler-745db564zwn79\" (UID: \"9af4f27b-8648-4b73-9417-ff0ac304c1aa\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-745db564zwn79"
Apr 25 00:07:58.499229 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:58.499179 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/9af4f27b-8648-4b73-9417-ff0ac304c1aa-tokenizer-uds\") pod \"precise-prefix-cache-test-kserve-router-scheduler-745db564zwn79\" (UID: \"9af4f27b-8648-4b73-9417-ff0ac304c1aa\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-745db564zwn79"
Apr 25 00:07:58.499229 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:58.499206 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/9af4f27b-8648-4b73-9417-ff0ac304c1aa-tokenizer-tmp\") pod \"precise-prefix-cache-test-kserve-router-scheduler-745db564zwn79\" (UID: \"9af4f27b-8648-4b73-9417-ff0ac304c1aa\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-745db564zwn79"
Apr 25 00:07:58.499415 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:58.499292 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9af4f27b-8648-4b73-9417-ff0ac304c1aa-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-router-scheduler-745db564zwn79\" (UID: \"9af4f27b-8648-4b73-9417-ff0ac304c1aa\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-745db564zwn79"
Apr 25 00:07:58.594844 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:58.594810 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-74cfd44d4-xprjr"]
Apr 25 00:07:58.598323 ip-10-0-133-214 kubenswrapper[2569]: W0425 00:07:58.598288 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5112a663_07dc_4f85_b04c_590fede83755.slice/crio-4da97050aff35c0ea37f4562b7adfc6a1ab92ccd5fe252479574eb712b05640a WatchSource:0}: Error finding container 4da97050aff35c0ea37f4562b7adfc6a1ab92ccd5fe252479574eb712b05640a: Status 404 returned error can't find the container with id 4da97050aff35c0ea37f4562b7adfc6a1ab92ccd5fe252479574eb712b05640a
Apr 25 00:07:58.599799 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:58.599773 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9af4f27b-8648-4b73-9417-ff0ac304c1aa-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-router-scheduler-745db564zwn79\" (UID: \"9af4f27b-8648-4b73-9417-ff0ac304c1aa\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-745db564zwn79"
Apr 25 00:07:58.599912 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:58.599860 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/9af4f27b-8648-4b73-9417-ff0ac304c1aa-tokenizer-cache\") pod \"precise-prefix-cache-test-kserve-router-scheduler-745db564zwn79\" (UID: \"9af4f27b-8648-4b73-9417-ff0ac304c1aa\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-745db564zwn79"
Apr 25 00:07:58.599912 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:58.599892 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zdg8q\" (UniqueName: \"kubernetes.io/projected/9af4f27b-8648-4b73-9417-ff0ac304c1aa-kube-api-access-zdg8q\") pod \"precise-prefix-cache-test-kserve-router-scheduler-745db564zwn79\" (UID: \"9af4f27b-8648-4b73-9417-ff0ac304c1aa\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-745db564zwn79"
Apr 25 00:07:58.600029 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:58.599930 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9af4f27b-8648-4b73-9417-ff0ac304c1aa-tls-certs\") pod \"precise-prefix-cache-test-kserve-router-scheduler-745db564zwn79\" (UID: \"9af4f27b-8648-4b73-9417-ff0ac304c1aa\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-745db564zwn79"
Apr 25 00:07:58.600029 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:58.599956 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/9af4f27b-8648-4b73-9417-ff0ac304c1aa-tokenizer-uds\") pod \"precise-prefix-cache-test-kserve-router-scheduler-745db564zwn79\" (UID: \"9af4f27b-8648-4b73-9417-ff0ac304c1aa\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-745db564zwn79"
Apr 25 00:07:58.600029 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:58.599983 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/9af4f27b-8648-4b73-9417-ff0ac304c1aa-tokenizer-tmp\") pod \"precise-prefix-cache-test-kserve-router-scheduler-745db564zwn79\" (UID: \"9af4f27b-8648-4b73-9417-ff0ac304c1aa\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-745db564zwn79"
Apr 25 00:07:58.600224 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:58.600201 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9af4f27b-8648-4b73-9417-ff0ac304c1aa-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-router-scheduler-745db564zwn79\" (UID: \"9af4f27b-8648-4b73-9417-ff0ac304c1aa\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-745db564zwn79"
Apr 25 00:07:58.600293 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:58.600242 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/9af4f27b-8648-4b73-9417-ff0ac304c1aa-tokenizer-cache\") pod \"precise-prefix-cache-test-kserve-router-scheduler-745db564zwn79\" (UID: \"9af4f27b-8648-4b73-9417-ff0ac304c1aa\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-745db564zwn79"
Apr 25 00:07:58.600293 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:58.600283 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/9af4f27b-8648-4b73-9417-ff0ac304c1aa-tokenizer-tmp\") pod \"precise-prefix-cache-test-kserve-router-scheduler-745db564zwn79\" (UID: \"9af4f27b-8648-4b73-9417-ff0ac304c1aa\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-745db564zwn79"
Apr 25 00:07:58.600429 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:58.600347 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/9af4f27b-8648-4b73-9417-ff0ac304c1aa-tokenizer-uds\") pod \"precise-prefix-cache-test-kserve-router-scheduler-745db564zwn79\" (UID: \"9af4f27b-8648-4b73-9417-ff0ac304c1aa\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-745db564zwn79"
Apr 25 00:07:58.602795 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:58.602769 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9af4f27b-8648-4b73-9417-ff0ac304c1aa-tls-certs\") pod \"precise-prefix-cache-test-kserve-router-scheduler-745db564zwn79\" (UID: \"9af4f27b-8648-4b73-9417-ff0ac304c1aa\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-745db564zwn79"
Apr 25 00:07:58.607492 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:58.607464 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdg8q\" (UniqueName: \"kubernetes.io/projected/9af4f27b-8648-4b73-9417-ff0ac304c1aa-kube-api-access-zdg8q\") pod \"precise-prefix-cache-test-kserve-router-scheduler-745db564zwn79\" (UID: \"9af4f27b-8648-4b73-9417-ff0ac304c1aa\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-745db564zwn79"
Apr 25 00:07:58.795005 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:58.794964 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-745db564zwn79"
Apr 25 00:07:58.921945 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:58.921920 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-745db564zwn79"]
Apr 25 00:07:58.923678 ip-10-0-133-214 kubenswrapper[2569]: W0425 00:07:58.923651 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9af4f27b_8648_4b73_9417_ff0ac304c1aa.slice/crio-8ab3893adc67ef59bc7556714f39eb5bdf92febf5f4320927c68b7525ff3b9bc WatchSource:0}: Error finding container 8ab3893adc67ef59bc7556714f39eb5bdf92febf5f4320927c68b7525ff3b9bc: Status 404 returned error can't find the container with id 8ab3893adc67ef59bc7556714f39eb5bdf92febf5f4320927c68b7525ff3b9bc
Apr 25 00:07:59.155546 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:59.155461 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-745db564zwn79" event={"ID":"9af4f27b-8648-4b73-9417-ff0ac304c1aa","Type":"ContainerStarted","Data":"7e534c07e62a8926cd114e860476e837f783d41d711afa58890acf259be97244"}
Apr 25 00:07:59.155546 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:59.155504 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-745db564zwn79"
event={"ID":"9af4f27b-8648-4b73-9417-ff0ac304c1aa","Type":"ContainerStarted","Data":"8ab3893adc67ef59bc7556714f39eb5bdf92febf5f4320927c68b7525ff3b9bc"} Apr 25 00:07:59.157004 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:59.156974 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-74cfd44d4-xprjr" event={"ID":"5112a663-07dc-4f85-b04c-590fede83755","Type":"ContainerStarted","Data":"b9af5e5d53406950fb127f1f6034eb404f68be3716ef3f2d9a6f430df39c802f"} Apr 25 00:07:59.157145 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:07:59.157008 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-74cfd44d4-xprjr" event={"ID":"5112a663-07dc-4f85-b04c-590fede83755","Type":"ContainerStarted","Data":"4da97050aff35c0ea37f4562b7adfc6a1ab92ccd5fe252479574eb712b05640a"} Apr 25 00:08:00.164617 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:08:00.164528 2569 generic.go:358] "Generic (PLEG): container finished" podID="9af4f27b-8648-4b73-9417-ff0ac304c1aa" containerID="7e534c07e62a8926cd114e860476e837f783d41d711afa58890acf259be97244" exitCode=0 Apr 25 00:08:00.164960 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:08:00.164645 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-745db564zwn79" event={"ID":"9af4f27b-8648-4b73-9417-ff0ac304c1aa","Type":"ContainerDied","Data":"7e534c07e62a8926cd114e860476e837f783d41d711afa58890acf259be97244"} Apr 25 00:08:01.170606 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:08:01.170570 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-745db564zwn79" event={"ID":"9af4f27b-8648-4b73-9417-ff0ac304c1aa","Type":"ContainerStarted","Data":"a24685c3339ca79e9b575b93039702a0053b09ec50cedc0e3a8923157a2de08d"} Apr 25 00:08:01.170606 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:08:01.170611 2569 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-745db564zwn79" event={"ID":"9af4f27b-8648-4b73-9417-ff0ac304c1aa","Type":"ContainerStarted","Data":"ca79d5d730b52f35455a5bfa2464076fc3ec411363fe3e849bd7819a11113200"} Apr 25 00:08:01.171087 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:08:01.170643 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-745db564zwn79" Apr 25 00:08:01.197027 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:08:01.196946 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-745db564zwn79" podStartSLOduration=3.196923265 podStartE2EDuration="3.196923265s" podCreationTimestamp="2026-04-25 00:07:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-25 00:08:01.192484456 +0000 UTC m=+859.797211047" watchObservedRunningTime="2026-04-25 00:08:01.196923265 +0000 UTC m=+859.801649866" Apr 25 00:08:08.795302 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:08:08.795258 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-745db564zwn79" Apr 25 00:08:08.795707 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:08:08.795319 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-745db564zwn79" Apr 25 00:08:08.796646 ip-10-0-133-214 kubenswrapper[2569]: W0425 00:08:08.796623 2569 logging.go:55] [core] [Channel #77 SubChannel #78]grpc: addrConn.createTransport failed to connect to {Addr: "10.133.0.53:9003", ServerName: "10.133.0.53:9003", }. 
Err: connection error: desc = "transport: Error while dialing: dial tcp 10.133.0.53:9003: connect: connection refused" Apr 25 00:08:08.798003 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:08:08.797986 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-745db564zwn79" Apr 25 00:08:09.202802 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:08:09.202717 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-745db564zwn79" Apr 25 00:08:09.795558 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:08:09.795510 2569 prober.go:120] "Probe failed" probeType="Liveness" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-745db564zwn79" podUID="9af4f27b-8648-4b73-9417-ff0ac304c1aa" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.133.0.53:9003\" within 1s: context deadline exceeded" Apr 25 00:08:18.796467 ip-10-0-133-214 kubenswrapper[2569]: W0425 00:08:18.796435 2569 logging.go:55] [core] [Channel #79 SubChannel #80]grpc: addrConn.createTransport failed to connect to {Addr: "10.133.0.53:9003", ServerName: "10.133.0.53:9003", }. 
Err: connection error: desc = "transport: Error while dialing: dial tcp 10.133.0.53:9003: connect: connection refused" Apr 25 00:08:19.796468 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:08:19.796424 2569 prober.go:120] "Probe failed" probeType="Liveness" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-745db564zwn79" podUID="9af4f27b-8648-4b73-9417-ff0ac304c1aa" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.133.0.53:9003\" within 1s: context deadline exceeded" Apr 25 00:08:30.207756 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:08:30.207725 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-745db564zwn79" Apr 25 00:08:56.396041 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:08:56.396004 2569 generic.go:358] "Generic (PLEG): container finished" podID="5112a663-07dc-4f85-b04c-590fede83755" containerID="b9af5e5d53406950fb127f1f6034eb404f68be3716ef3f2d9a6f430df39c802f" exitCode=0 Apr 25 00:08:56.396521 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:08:56.396080 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-74cfd44d4-xprjr" event={"ID":"5112a663-07dc-4f85-b04c-590fede83755","Type":"ContainerDied","Data":"b9af5e5d53406950fb127f1f6034eb404f68be3716ef3f2d9a6f430df39c802f"} Apr 25 00:08:56.397158 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:08:56.397141 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 25 00:08:57.402179 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:08:57.402143 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-74cfd44d4-xprjr" event={"ID":"5112a663-07dc-4f85-b04c-590fede83755","Type":"ContainerStarted","Data":"33cdb354cb0cb2a940cb0ac38478d3aeb4dce1f2fe27b5f3bbd25f7562397823"} Apr 25 00:08:57.421867 
ip-10-0-133-214 kubenswrapper[2569]: I0425 00:08:57.421822 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-74cfd44d4-xprjr" podStartSLOduration=59.421807883 podStartE2EDuration="59.421807883s" podCreationTimestamp="2026-04-25 00:07:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-25 00:08:57.418782513 +0000 UTC m=+916.023509110" watchObservedRunningTime="2026-04-25 00:08:57.421807883 +0000 UTC m=+916.026534481" Apr 25 00:08:58.462060 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:08:58.462028 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-74cfd44d4-xprjr" Apr 25 00:08:58.462060 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:08:58.462066 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-74cfd44d4-xprjr" Apr 25 00:08:58.474663 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:08:58.474639 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-74cfd44d4-xprjr" Apr 25 00:08:59.420591 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:08:59.420565 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-74cfd44d4-xprjr" Apr 25 00:09:00.280034 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:00.279998 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-745db564zwn79"] Apr 25 00:09:00.280556 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:00.280479 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-745db564zwn79" 
podUID="9af4f27b-8648-4b73-9417-ff0ac304c1aa" containerName="main" containerID="cri-o://ca79d5d730b52f35455a5bfa2464076fc3ec411363fe3e849bd7819a11113200" gracePeriod=30 Apr 25 00:09:00.280556 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:00.280485 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-745db564zwn79" podUID="9af4f27b-8648-4b73-9417-ff0ac304c1aa" containerName="tokenizer" containerID="cri-o://a24685c3339ca79e9b575b93039702a0053b09ec50cedc0e3a8923157a2de08d" gracePeriod=30 Apr 25 00:09:00.284986 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:00.284936 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-74cfd44d4-xprjr"] Apr 25 00:09:00.414963 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:00.414932 2569 generic.go:358] "Generic (PLEG): container finished" podID="9af4f27b-8648-4b73-9417-ff0ac304c1aa" containerID="ca79d5d730b52f35455a5bfa2464076fc3ec411363fe3e849bd7819a11113200" exitCode=0 Apr 25 00:09:00.415140 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:00.415004 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-745db564zwn79" event={"ID":"9af4f27b-8648-4b73-9417-ff0ac304c1aa","Type":"ContainerDied","Data":"ca79d5d730b52f35455a5bfa2464076fc3ec411363fe3e849bd7819a11113200"} Apr 25 00:09:00.527576 ip-10-0-133-214 kubenswrapper[2569]: E0425 00:09:00.527538 2569 secret.go:189] Couldn't get secret kserve-ci-e2e-test/precise-prefix-cache-test-kserve-self-signed-certs: secret "precise-prefix-cache-test-kserve-self-signed-certs" not found Apr 25 00:09:00.527744 ip-10-0-133-214 kubenswrapper[2569]: E0425 00:09:00.527630 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5112a663-07dc-4f85-b04c-590fede83755-tls-certs podName:5112a663-07dc-4f85-b04c-590fede83755 nodeName:}" failed. 
No retries permitted until 2026-04-25 00:09:01.027604584 +0000 UTC m=+919.632331167 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/5112a663-07dc-4f85-b04c-590fede83755-tls-certs") pod "precise-prefix-cache-test-kserve-74cfd44d4-xprjr" (UID: "5112a663-07dc-4f85-b04c-590fede83755") : secret "precise-prefix-cache-test-kserve-self-signed-certs" not found Apr 25 00:09:01.032618 ip-10-0-133-214 kubenswrapper[2569]: E0425 00:09:01.032584 2569 secret.go:189] Couldn't get secret kserve-ci-e2e-test/precise-prefix-cache-test-kserve-self-signed-certs: secret "precise-prefix-cache-test-kserve-self-signed-certs" not found Apr 25 00:09:01.032829 ip-10-0-133-214 kubenswrapper[2569]: E0425 00:09:01.032682 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5112a663-07dc-4f85-b04c-590fede83755-tls-certs podName:5112a663-07dc-4f85-b04c-590fede83755 nodeName:}" failed. No retries permitted until 2026-04-25 00:09:02.032665952 +0000 UTC m=+920.637392528 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/5112a663-07dc-4f85-b04c-590fede83755-tls-certs") pod "precise-prefix-cache-test-kserve-74cfd44d4-xprjr" (UID: "5112a663-07dc-4f85-b04c-590fede83755") : secret "precise-prefix-cache-test-kserve-self-signed-certs" not found Apr 25 00:09:01.419171 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:01.419064 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-74cfd44d4-xprjr" podUID="5112a663-07dc-4f85-b04c-590fede83755" containerName="main" containerID="cri-o://33cdb354cb0cb2a940cb0ac38478d3aeb4dce1f2fe27b5f3bbd25f7562397823" gracePeriod=30 Apr 25 00:09:01.677818 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:01.677794 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-74cfd44d4-xprjr" Apr 25 00:09:01.738991 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:01.738962 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5112a663-07dc-4f85-b04c-590fede83755-tls-certs\") pod \"5112a663-07dc-4f85-b04c-590fede83755\" (UID: \"5112a663-07dc-4f85-b04c-590fede83755\") " Apr 25 00:09:01.739172 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:01.739008 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5112a663-07dc-4f85-b04c-590fede83755-kserve-provision-location\") pod \"5112a663-07dc-4f85-b04c-590fede83755\" (UID: \"5112a663-07dc-4f85-b04c-590fede83755\") " Apr 25 00:09:01.739172 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:01.739039 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5112a663-07dc-4f85-b04c-590fede83755-home\") pod \"5112a663-07dc-4f85-b04c-590fede83755\" (UID: \"5112a663-07dc-4f85-b04c-590fede83755\") " Apr 25 00:09:01.739172 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:01.739085 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dpcs5\" (UniqueName: \"kubernetes.io/projected/5112a663-07dc-4f85-b04c-590fede83755-kube-api-access-dpcs5\") pod \"5112a663-07dc-4f85-b04c-590fede83755\" (UID: \"5112a663-07dc-4f85-b04c-590fede83755\") " Apr 25 00:09:01.739172 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:01.739140 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5112a663-07dc-4f85-b04c-590fede83755-dshm\") pod \"5112a663-07dc-4f85-b04c-590fede83755\" (UID: \"5112a663-07dc-4f85-b04c-590fede83755\") " Apr 25 00:09:01.739468 ip-10-0-133-214 
kubenswrapper[2569]: I0425 00:09:01.739184 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5112a663-07dc-4f85-b04c-590fede83755-tmp-dir\") pod \"5112a663-07dc-4f85-b04c-590fede83755\" (UID: \"5112a663-07dc-4f85-b04c-590fede83755\") " Apr 25 00:09:01.739468 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:01.739225 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5112a663-07dc-4f85-b04c-590fede83755-model-cache\") pod \"5112a663-07dc-4f85-b04c-590fede83755\" (UID: \"5112a663-07dc-4f85-b04c-590fede83755\") " Apr 25 00:09:01.739468 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:01.739380 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5112a663-07dc-4f85-b04c-590fede83755-home" (OuterVolumeSpecName: "home") pod "5112a663-07dc-4f85-b04c-590fede83755" (UID: "5112a663-07dc-4f85-b04c-590fede83755"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 25 00:09:01.739637 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:01.739513 2569 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5112a663-07dc-4f85-b04c-590fede83755-home\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\"" Apr 25 00:09:01.739637 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:01.739546 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5112a663-07dc-4f85-b04c-590fede83755-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "5112a663-07dc-4f85-b04c-590fede83755" (UID: "5112a663-07dc-4f85-b04c-590fede83755"). InnerVolumeSpecName "tmp-dir". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 25 00:09:01.739718 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:01.739646 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5112a663-07dc-4f85-b04c-590fede83755-model-cache" (OuterVolumeSpecName: "model-cache") pod "5112a663-07dc-4f85-b04c-590fede83755" (UID: "5112a663-07dc-4f85-b04c-590fede83755"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 25 00:09:01.741829 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:01.741803 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5112a663-07dc-4f85-b04c-590fede83755-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "5112a663-07dc-4f85-b04c-590fede83755" (UID: "5112a663-07dc-4f85-b04c-590fede83755"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 25 00:09:01.742040 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:01.742016 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5112a663-07dc-4f85-b04c-590fede83755-dshm" (OuterVolumeSpecName: "dshm") pod "5112a663-07dc-4f85-b04c-590fede83755" (UID: "5112a663-07dc-4f85-b04c-590fede83755"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 25 00:09:01.742316 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:01.742293 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5112a663-07dc-4f85-b04c-590fede83755-kube-api-access-dpcs5" (OuterVolumeSpecName: "kube-api-access-dpcs5") pod "5112a663-07dc-4f85-b04c-590fede83755" (UID: "5112a663-07dc-4f85-b04c-590fede83755"). InnerVolumeSpecName "kube-api-access-dpcs5". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 25 00:09:01.755681 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:01.755661 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-745db564zwn79" Apr 25 00:09:01.798365 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:01.798314 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5112a663-07dc-4f85-b04c-590fede83755-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "5112a663-07dc-4f85-b04c-590fede83755" (UID: "5112a663-07dc-4f85-b04c-590fede83755"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 25 00:09:01.839877 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:01.839854 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zdg8q\" (UniqueName: \"kubernetes.io/projected/9af4f27b-8648-4b73-9417-ff0ac304c1aa-kube-api-access-zdg8q\") pod \"9af4f27b-8648-4b73-9417-ff0ac304c1aa\" (UID: \"9af4f27b-8648-4b73-9417-ff0ac304c1aa\") " Apr 25 00:09:01.840058 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:01.839902 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/9af4f27b-8648-4b73-9417-ff0ac304c1aa-tokenizer-cache\") pod \"9af4f27b-8648-4b73-9417-ff0ac304c1aa\" (UID: \"9af4f27b-8648-4b73-9417-ff0ac304c1aa\") " Apr 25 00:09:01.840058 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:01.839927 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/9af4f27b-8648-4b73-9417-ff0ac304c1aa-tokenizer-uds\") pod \"9af4f27b-8648-4b73-9417-ff0ac304c1aa\" (UID: \"9af4f27b-8648-4b73-9417-ff0ac304c1aa\") " Apr 25 00:09:01.840058 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:01.839957 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/9af4f27b-8648-4b73-9417-ff0ac304c1aa-tls-certs\") pod \"9af4f27b-8648-4b73-9417-ff0ac304c1aa\" (UID: \"9af4f27b-8648-4b73-9417-ff0ac304c1aa\") " Apr 25 00:09:01.840058 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:01.840034 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/9af4f27b-8648-4b73-9417-ff0ac304c1aa-tokenizer-tmp\") pod \"9af4f27b-8648-4b73-9417-ff0ac304c1aa\" (UID: \"9af4f27b-8648-4b73-9417-ff0ac304c1aa\") " Apr 25 00:09:01.840277 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:01.840061 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9af4f27b-8648-4b73-9417-ff0ac304c1aa-kserve-provision-location\") pod \"9af4f27b-8648-4b73-9417-ff0ac304c1aa\" (UID: \"9af4f27b-8648-4b73-9417-ff0ac304c1aa\") " Apr 25 00:09:01.840277 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:01.840214 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9af4f27b-8648-4b73-9417-ff0ac304c1aa-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "9af4f27b-8648-4b73-9417-ff0ac304c1aa" (UID: "9af4f27b-8648-4b73-9417-ff0ac304c1aa"). InnerVolumeSpecName "tokenizer-uds". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 25 00:09:01.840392 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:01.840308 2569 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5112a663-07dc-4f85-b04c-590fede83755-dshm\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\"" Apr 25 00:09:01.840392 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:01.840329 2569 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5112a663-07dc-4f85-b04c-590fede83755-tmp-dir\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\"" Apr 25 00:09:01.840392 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:01.840340 2569 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5112a663-07dc-4f85-b04c-590fede83755-model-cache\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\"" Apr 25 00:09:01.840392 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:01.840348 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/9af4f27b-8648-4b73-9417-ff0ac304c1aa-tokenizer-uds\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\"" Apr 25 00:09:01.840392 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:01.840356 2569 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5112a663-07dc-4f85-b04c-590fede83755-tls-certs\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\"" Apr 25 00:09:01.840392 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:01.840257 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9af4f27b-8648-4b73-9417-ff0ac304c1aa-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "9af4f27b-8648-4b73-9417-ff0ac304c1aa" (UID: "9af4f27b-8648-4b73-9417-ff0ac304c1aa"). InnerVolumeSpecName "tokenizer-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 25 00:09:01.840392 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:01.840368 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5112a663-07dc-4f85-b04c-590fede83755-kserve-provision-location\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\"" Apr 25 00:09:01.840665 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:01.840441 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dpcs5\" (UniqueName: \"kubernetes.io/projected/5112a663-07dc-4f85-b04c-590fede83755-kube-api-access-dpcs5\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\"" Apr 25 00:09:01.840665 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:01.840447 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9af4f27b-8648-4b73-9417-ff0ac304c1aa-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "9af4f27b-8648-4b73-9417-ff0ac304c1aa" (UID: "9af4f27b-8648-4b73-9417-ff0ac304c1aa"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 25 00:09:01.840820 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:01.840800 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9af4f27b-8648-4b73-9417-ff0ac304c1aa-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "9af4f27b-8648-4b73-9417-ff0ac304c1aa" (UID: "9af4f27b-8648-4b73-9417-ff0ac304c1aa"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 25 00:09:01.842095 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:01.842074 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9af4f27b-8648-4b73-9417-ff0ac304c1aa-kube-api-access-zdg8q" (OuterVolumeSpecName: "kube-api-access-zdg8q") pod "9af4f27b-8648-4b73-9417-ff0ac304c1aa" (UID: "9af4f27b-8648-4b73-9417-ff0ac304c1aa"). InnerVolumeSpecName "kube-api-access-zdg8q". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 25 00:09:01.842168 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:01.842088 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9af4f27b-8648-4b73-9417-ff0ac304c1aa-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "9af4f27b-8648-4b73-9417-ff0ac304c1aa" (UID: "9af4f27b-8648-4b73-9417-ff0ac304c1aa"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 25 00:09:01.941252 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:01.941179 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/9af4f27b-8648-4b73-9417-ff0ac304c1aa-tokenizer-tmp\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\""
Apr 25 00:09:01.941252 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:01.941206 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9af4f27b-8648-4b73-9417-ff0ac304c1aa-kserve-provision-location\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\""
Apr 25 00:09:01.941252 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:01.941217 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zdg8q\" (UniqueName: \"kubernetes.io/projected/9af4f27b-8648-4b73-9417-ff0ac304c1aa-kube-api-access-zdg8q\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\""
Apr 25 00:09:01.941252 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:01.941226 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/9af4f27b-8648-4b73-9417-ff0ac304c1aa-tokenizer-cache\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\""
Apr 25 00:09:01.941252 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:01.941237 2569 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9af4f27b-8648-4b73-9417-ff0ac304c1aa-tls-certs\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\""
Apr 25 00:09:02.424271 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:02.424236 2569 generic.go:358] "Generic (PLEG): container finished" podID="9af4f27b-8648-4b73-9417-ff0ac304c1aa" containerID="a24685c3339ca79e9b575b93039702a0053b09ec50cedc0e3a8923157a2de08d" exitCode=0
Apr 25 00:09:02.424760 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:02.424306 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-745db564zwn79"
Apr 25 00:09:02.424760 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:02.424319 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-745db564zwn79" event={"ID":"9af4f27b-8648-4b73-9417-ff0ac304c1aa","Type":"ContainerDied","Data":"a24685c3339ca79e9b575b93039702a0053b09ec50cedc0e3a8923157a2de08d"}
Apr 25 00:09:02.424760 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:02.424358 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-745db564zwn79" event={"ID":"9af4f27b-8648-4b73-9417-ff0ac304c1aa","Type":"ContainerDied","Data":"8ab3893adc67ef59bc7556714f39eb5bdf92febf5f4320927c68b7525ff3b9bc"}
Apr 25 00:09:02.424760 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:02.424374 2569 scope.go:117] "RemoveContainer" containerID="a24685c3339ca79e9b575b93039702a0053b09ec50cedc0e3a8923157a2de08d"
Apr 25 00:09:02.425936 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:02.425915 2569 generic.go:358] "Generic (PLEG): container finished" podID="5112a663-07dc-4f85-b04c-590fede83755" containerID="33cdb354cb0cb2a940cb0ac38478d3aeb4dce1f2fe27b5f3bbd25f7562397823" exitCode=0
Apr 25 00:09:02.426028 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:02.425985 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-74cfd44d4-xprjr"
Apr 25 00:09:02.426028 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:02.425994 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-74cfd44d4-xprjr" event={"ID":"5112a663-07dc-4f85-b04c-590fede83755","Type":"ContainerDied","Data":"33cdb354cb0cb2a940cb0ac38478d3aeb4dce1f2fe27b5f3bbd25f7562397823"}
Apr 25 00:09:02.426115 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:02.426027 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-74cfd44d4-xprjr" event={"ID":"5112a663-07dc-4f85-b04c-590fede83755","Type":"ContainerDied","Data":"4da97050aff35c0ea37f4562b7adfc6a1ab92ccd5fe252479574eb712b05640a"}
Apr 25 00:09:02.441939 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:02.441838 2569 scope.go:117] "RemoveContainer" containerID="ca79d5d730b52f35455a5bfa2464076fc3ec411363fe3e849bd7819a11113200"
Apr 25 00:09:02.450512 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:02.450492 2569 scope.go:117] "RemoveContainer" containerID="7e534c07e62a8926cd114e860476e837f783d41d711afa58890acf259be97244"
Apr 25 00:09:02.452833 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:02.452812 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-745db564zwn79"]
Apr 25 00:09:02.459006 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:02.458986 2569 scope.go:117] "RemoveContainer" containerID="a24685c3339ca79e9b575b93039702a0053b09ec50cedc0e3a8923157a2de08d"
Apr 25 00:09:02.459263 ip-10-0-133-214 kubenswrapper[2569]: E0425 00:09:02.459245 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a24685c3339ca79e9b575b93039702a0053b09ec50cedc0e3a8923157a2de08d\": container with ID starting with a24685c3339ca79e9b575b93039702a0053b09ec50cedc0e3a8923157a2de08d not found: ID does not exist" containerID="a24685c3339ca79e9b575b93039702a0053b09ec50cedc0e3a8923157a2de08d"
Apr 25 00:09:02.459357 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:02.459275 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a24685c3339ca79e9b575b93039702a0053b09ec50cedc0e3a8923157a2de08d"} err="failed to get container status \"a24685c3339ca79e9b575b93039702a0053b09ec50cedc0e3a8923157a2de08d\": rpc error: code = NotFound desc = could not find container \"a24685c3339ca79e9b575b93039702a0053b09ec50cedc0e3a8923157a2de08d\": container with ID starting with a24685c3339ca79e9b575b93039702a0053b09ec50cedc0e3a8923157a2de08d not found: ID does not exist"
Apr 25 00:09:02.459357 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:02.459309 2569 scope.go:117] "RemoveContainer" containerID="ca79d5d730b52f35455a5bfa2464076fc3ec411363fe3e849bd7819a11113200"
Apr 25 00:09:02.459629 ip-10-0-133-214 kubenswrapper[2569]: E0425 00:09:02.459611 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca79d5d730b52f35455a5bfa2464076fc3ec411363fe3e849bd7819a11113200\": container with ID starting with ca79d5d730b52f35455a5bfa2464076fc3ec411363fe3e849bd7819a11113200 not found: ID does not exist" containerID="ca79d5d730b52f35455a5bfa2464076fc3ec411363fe3e849bd7819a11113200"
Apr 25 00:09:02.459687 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:02.459635 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca79d5d730b52f35455a5bfa2464076fc3ec411363fe3e849bd7819a11113200"} err="failed to get container status \"ca79d5d730b52f35455a5bfa2464076fc3ec411363fe3e849bd7819a11113200\": rpc error: code = NotFound desc = could not find container \"ca79d5d730b52f35455a5bfa2464076fc3ec411363fe3e849bd7819a11113200\": container with ID starting with ca79d5d730b52f35455a5bfa2464076fc3ec411363fe3e849bd7819a11113200 not found: ID does not exist"
Apr 25 00:09:02.459687 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:02.459651 2569 scope.go:117] "RemoveContainer" containerID="7e534c07e62a8926cd114e860476e837f783d41d711afa58890acf259be97244"
Apr 25 00:09:02.459881 ip-10-0-133-214 kubenswrapper[2569]: E0425 00:09:02.459867 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e534c07e62a8926cd114e860476e837f783d41d711afa58890acf259be97244\": container with ID starting with 7e534c07e62a8926cd114e860476e837f783d41d711afa58890acf259be97244 not found: ID does not exist" containerID="7e534c07e62a8926cd114e860476e837f783d41d711afa58890acf259be97244"
Apr 25 00:09:02.459928 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:02.459884 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e534c07e62a8926cd114e860476e837f783d41d711afa58890acf259be97244"} err="failed to get container status \"7e534c07e62a8926cd114e860476e837f783d41d711afa58890acf259be97244\": rpc error: code = NotFound desc = could not find container \"7e534c07e62a8926cd114e860476e837f783d41d711afa58890acf259be97244\": container with ID starting with 7e534c07e62a8926cd114e860476e837f783d41d711afa58890acf259be97244 not found: ID does not exist"
Apr 25 00:09:02.459928 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:02.459896 2569 scope.go:117] "RemoveContainer" containerID="33cdb354cb0cb2a940cb0ac38478d3aeb4dce1f2fe27b5f3bbd25f7562397823"
Apr 25 00:09:02.460233 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:02.460203 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-745db564zwn79"]
Apr 25 00:09:02.467936 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:02.467906 2569 scope.go:117] "RemoveContainer" containerID="b9af5e5d53406950fb127f1f6034eb404f68be3716ef3f2d9a6f430df39c802f"
Apr 25 00:09:02.471371 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:02.471348 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-74cfd44d4-xprjr"]
Apr 25 00:09:02.475052 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:02.475030 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-74cfd44d4-xprjr"]
Apr 25 00:09:02.532914 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:02.532894 2569 scope.go:117] "RemoveContainer" containerID="33cdb354cb0cb2a940cb0ac38478d3aeb4dce1f2fe27b5f3bbd25f7562397823"
Apr 25 00:09:02.533215 ip-10-0-133-214 kubenswrapper[2569]: E0425 00:09:02.533195 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33cdb354cb0cb2a940cb0ac38478d3aeb4dce1f2fe27b5f3bbd25f7562397823\": container with ID starting with 33cdb354cb0cb2a940cb0ac38478d3aeb4dce1f2fe27b5f3bbd25f7562397823 not found: ID does not exist" containerID="33cdb354cb0cb2a940cb0ac38478d3aeb4dce1f2fe27b5f3bbd25f7562397823"
Apr 25 00:09:02.533295 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:02.533224 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33cdb354cb0cb2a940cb0ac38478d3aeb4dce1f2fe27b5f3bbd25f7562397823"} err="failed to get container status \"33cdb354cb0cb2a940cb0ac38478d3aeb4dce1f2fe27b5f3bbd25f7562397823\": rpc error: code = NotFound desc = could not find container \"33cdb354cb0cb2a940cb0ac38478d3aeb4dce1f2fe27b5f3bbd25f7562397823\": container with ID starting with 33cdb354cb0cb2a940cb0ac38478d3aeb4dce1f2fe27b5f3bbd25f7562397823 not found: ID does not exist"
Apr 25 00:09:02.533295 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:02.533245 2569 scope.go:117] "RemoveContainer" containerID="b9af5e5d53406950fb127f1f6034eb404f68be3716ef3f2d9a6f430df39c802f"
Apr 25 00:09:02.533540 ip-10-0-133-214 kubenswrapper[2569]: E0425 00:09:02.533520 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9af5e5d53406950fb127f1f6034eb404f68be3716ef3f2d9a6f430df39c802f\": container with ID starting with b9af5e5d53406950fb127f1f6034eb404f68be3716ef3f2d9a6f430df39c802f not found: ID does not exist" containerID="b9af5e5d53406950fb127f1f6034eb404f68be3716ef3f2d9a6f430df39c802f"
Apr 25 00:09:02.533609 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:02.533548 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9af5e5d53406950fb127f1f6034eb404f68be3716ef3f2d9a6f430df39c802f"} err="failed to get container status \"b9af5e5d53406950fb127f1f6034eb404f68be3716ef3f2d9a6f430df39c802f\": rpc error: code = NotFound desc = could not find container \"b9af5e5d53406950fb127f1f6034eb404f68be3716ef3f2d9a6f430df39c802f\": container with ID starting with b9af5e5d53406950fb127f1f6034eb404f68be3716ef3f2d9a6f430df39c802f not found: ID does not exist"
Apr 25 00:09:03.977744 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:03.977709 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5112a663-07dc-4f85-b04c-590fede83755" path="/var/lib/kubelet/pods/5112a663-07dc-4f85-b04c-590fede83755/volumes"
Apr 25 00:09:03.978139 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:03.978126 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9af4f27b-8648-4b73-9417-ff0ac304c1aa" path="/var/lib/kubelet/pods/9af4f27b-8648-4b73-9417-ff0ac304c1aa/volumes"
Apr 25 00:09:04.675914 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:04.675885 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-55694477d7-fmmpm"]
Apr 25 00:09:04.676234 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:04.676222 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9af4f27b-8648-4b73-9417-ff0ac304c1aa" containerName="storage-initializer"
Apr 25 00:09:04.676289 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:04.676237 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="9af4f27b-8648-4b73-9417-ff0ac304c1aa" containerName="storage-initializer"
Apr 25 00:09:04.676289 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:04.676250 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5112a663-07dc-4f85-b04c-590fede83755" containerName="main"
Apr 25 00:09:04.676289 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:04.676256 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="5112a663-07dc-4f85-b04c-590fede83755" containerName="main"
Apr 25 00:09:04.676289 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:04.676265 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9af4f27b-8648-4b73-9417-ff0ac304c1aa" containerName="tokenizer"
Apr 25 00:09:04.676289 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:04.676271 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="9af4f27b-8648-4b73-9417-ff0ac304c1aa" containerName="tokenizer"
Apr 25 00:09:04.676289 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:04.676282 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5112a663-07dc-4f85-b04c-590fede83755" containerName="storage-initializer"
Apr 25 00:09:04.676289 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:04.676287 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="5112a663-07dc-4f85-b04c-590fede83755" containerName="storage-initializer"
Apr 25 00:09:04.676539 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:04.676297 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9af4f27b-8648-4b73-9417-ff0ac304c1aa" containerName="main"
Apr 25 00:09:04.676539 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:04.676302 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="9af4f27b-8648-4b73-9417-ff0ac304c1aa" containerName="main"
Apr 25 00:09:04.676539 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:04.676351 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="9af4f27b-8648-4b73-9417-ff0ac304c1aa" containerName="tokenizer"
Apr 25 00:09:04.676539 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:04.676361 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="5112a663-07dc-4f85-b04c-590fede83755" containerName="main"
Apr 25 00:09:04.676539 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:04.676368 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="9af4f27b-8648-4b73-9417-ff0ac304c1aa" containerName="main"
Apr 25 00:09:04.681659 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:04.681615 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-55694477d7-fmmpm"
Apr 25 00:09:04.683944 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:04.683920 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-4dwzn\""
Apr 25 00:09:04.684078 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:04.683920 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-kserve-self-signed-certs\""
Apr 25 00:09:04.684675 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:04.684655 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-epp-sa-dockercfg-nv6tb\""
Apr 25 00:09:04.684787 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:04.684706 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 25 00:09:04.684787 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:04.684775 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\""
Apr 25 00:09:04.689124 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:04.689099 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-55694477d7-fmmpm"]
Apr 25 00:09:04.765145 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:04.765107 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3a1d2fae-cb9e-4e07-a230-16c2149ba657-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-55694477d7-fmmpm\" (UID: \"3a1d2fae-cb9e-4e07-a230-16c2149ba657\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-55694477d7-fmmpm"
Apr 25 00:09:04.765325 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:04.765171 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/3a1d2fae-cb9e-4e07-a230-16c2149ba657-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-55694477d7-fmmpm\" (UID: \"3a1d2fae-cb9e-4e07-a230-16c2149ba657\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-55694477d7-fmmpm"
Apr 25 00:09:04.765325 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:04.765198 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/3a1d2fae-cb9e-4e07-a230-16c2149ba657-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-55694477d7-fmmpm\" (UID: \"3a1d2fae-cb9e-4e07-a230-16c2149ba657\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-55694477d7-fmmpm"
Apr 25 00:09:04.765325 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:04.765223 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/3a1d2fae-cb9e-4e07-a230-16c2149ba657-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-55694477d7-fmmpm\" (UID: \"3a1d2fae-cb9e-4e07-a230-16c2149ba657\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-55694477d7-fmmpm"
Apr 25 00:09:04.765325 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:04.765265 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wzdx\" (UniqueName: \"kubernetes.io/projected/3a1d2fae-cb9e-4e07-a230-16c2149ba657-kube-api-access-7wzdx\") pod \"stop-feature-test-kserve-router-scheduler-55694477d7-fmmpm\" (UID: \"3a1d2fae-cb9e-4e07-a230-16c2149ba657\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-55694477d7-fmmpm"
Apr 25 00:09:04.765499 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:04.765340 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3a1d2fae-cb9e-4e07-a230-16c2149ba657-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-55694477d7-fmmpm\" (UID: \"3a1d2fae-cb9e-4e07-a230-16c2149ba657\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-55694477d7-fmmpm"
Apr 25 00:09:04.866058 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:04.866024 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/3a1d2fae-cb9e-4e07-a230-16c2149ba657-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-55694477d7-fmmpm\" (UID: \"3a1d2fae-cb9e-4e07-a230-16c2149ba657\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-55694477d7-fmmpm"
Apr 25 00:09:04.866058 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:04.866060 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/3a1d2fae-cb9e-4e07-a230-16c2149ba657-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-55694477d7-fmmpm\" (UID: \"3a1d2fae-cb9e-4e07-a230-16c2149ba657\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-55694477d7-fmmpm"
Apr 25 00:09:04.866308 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:04.866077 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/3a1d2fae-cb9e-4e07-a230-16c2149ba657-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-55694477d7-fmmpm\" (UID: \"3a1d2fae-cb9e-4e07-a230-16c2149ba657\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-55694477d7-fmmpm"
Apr 25 00:09:04.866308 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:04.866115 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7wzdx\" (UniqueName: \"kubernetes.io/projected/3a1d2fae-cb9e-4e07-a230-16c2149ba657-kube-api-access-7wzdx\") pod \"stop-feature-test-kserve-router-scheduler-55694477d7-fmmpm\" (UID: \"3a1d2fae-cb9e-4e07-a230-16c2149ba657\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-55694477d7-fmmpm"
Apr 25 00:09:04.866308 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:04.866157 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3a1d2fae-cb9e-4e07-a230-16c2149ba657-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-55694477d7-fmmpm\" (UID: \"3a1d2fae-cb9e-4e07-a230-16c2149ba657\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-55694477d7-fmmpm"
Apr 25 00:09:04.866308 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:04.866204 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3a1d2fae-cb9e-4e07-a230-16c2149ba657-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-55694477d7-fmmpm\" (UID: \"3a1d2fae-cb9e-4e07-a230-16c2149ba657\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-55694477d7-fmmpm"
Apr 25 00:09:04.866556 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:04.866522 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/3a1d2fae-cb9e-4e07-a230-16c2149ba657-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-55694477d7-fmmpm\" (UID: \"3a1d2fae-cb9e-4e07-a230-16c2149ba657\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-55694477d7-fmmpm"
Apr 25 00:09:04.866599 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:04.866552 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/3a1d2fae-cb9e-4e07-a230-16c2149ba657-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-55694477d7-fmmpm\" (UID: \"3a1d2fae-cb9e-4e07-a230-16c2149ba657\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-55694477d7-fmmpm"
Apr 25 00:09:04.866676 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:04.866588 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/3a1d2fae-cb9e-4e07-a230-16c2149ba657-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-55694477d7-fmmpm\" (UID: \"3a1d2fae-cb9e-4e07-a230-16c2149ba657\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-55694477d7-fmmpm"
Apr 25 00:09:04.866676 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:04.866606 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3a1d2fae-cb9e-4e07-a230-16c2149ba657-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-55694477d7-fmmpm\" (UID: \"3a1d2fae-cb9e-4e07-a230-16c2149ba657\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-55694477d7-fmmpm"
Apr 25 00:09:04.868784 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:04.868762 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3a1d2fae-cb9e-4e07-a230-16c2149ba657-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-55694477d7-fmmpm\" (UID: \"3a1d2fae-cb9e-4e07-a230-16c2149ba657\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-55694477d7-fmmpm"
Apr 25 00:09:04.873840 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:04.873816 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wzdx\" (UniqueName: \"kubernetes.io/projected/3a1d2fae-cb9e-4e07-a230-16c2149ba657-kube-api-access-7wzdx\") pod \"stop-feature-test-kserve-router-scheduler-55694477d7-fmmpm\" (UID: \"3a1d2fae-cb9e-4e07-a230-16c2149ba657\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-55694477d7-fmmpm"
Apr 25 00:09:04.993574 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:04.993490 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-55694477d7-fmmpm"
Apr 25 00:09:05.121952 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:05.121926 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-55694477d7-fmmpm"]
Apr 25 00:09:05.122888 ip-10-0-133-214 kubenswrapper[2569]: W0425 00:09:05.122858 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a1d2fae_cb9e_4e07_a230_16c2149ba657.slice/crio-174cd7f539dead6af2c443de25e79f50247dfcdec0b2c35d72703775a7970082 WatchSource:0}: Error finding container 174cd7f539dead6af2c443de25e79f50247dfcdec0b2c35d72703775a7970082: Status 404 returned error can't find the container with id 174cd7f539dead6af2c443de25e79f50247dfcdec0b2c35d72703775a7970082
Apr 25 00:09:05.440009 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:05.439970 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-55694477d7-fmmpm" event={"ID":"3a1d2fae-cb9e-4e07-a230-16c2149ba657","Type":"ContainerStarted","Data":"960cbeb575957e0f4ea9e50f6ab17b5df87a4c3f13a658033f126d093c4e8c02"}
Apr 25 00:09:05.440009 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:05.440014 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-55694477d7-fmmpm" event={"ID":"3a1d2fae-cb9e-4e07-a230-16c2149ba657","Type":"ContainerStarted","Data":"174cd7f539dead6af2c443de25e79f50247dfcdec0b2c35d72703775a7970082"}
Apr 25 00:09:06.444974 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:06.444939 2569 generic.go:358] "Generic (PLEG): container finished" podID="3a1d2fae-cb9e-4e07-a230-16c2149ba657" containerID="960cbeb575957e0f4ea9e50f6ab17b5df87a4c3f13a658033f126d093c4e8c02" exitCode=0
Apr 25 00:09:06.445385 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:06.445036 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-55694477d7-fmmpm" event={"ID":"3a1d2fae-cb9e-4e07-a230-16c2149ba657","Type":"ContainerDied","Data":"960cbeb575957e0f4ea9e50f6ab17b5df87a4c3f13a658033f126d093c4e8c02"}
Apr 25 00:09:07.450673 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:07.450634 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-55694477d7-fmmpm" event={"ID":"3a1d2fae-cb9e-4e07-a230-16c2149ba657","Type":"ContainerStarted","Data":"13d85176daee3c772b3f7d59eae928a5b9326a363f1381716d23c79ca08d96f8"}
Apr 25 00:09:07.450673 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:07.450673 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-55694477d7-fmmpm" event={"ID":"3a1d2fae-cb9e-4e07-a230-16c2149ba657","Type":"ContainerStarted","Data":"b566a71f887fbd2088938d1fad21abaf1df11c23e7535a6b9d3a9ebf02fbf2aa"}
Apr 25 00:09:07.451082 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:07.450701 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-55694477d7-fmmpm"
Apr 25 00:09:07.474848 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:07.474801 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-55694477d7-fmmpm" podStartSLOduration=3.474785674 podStartE2EDuration="3.474785674s" podCreationTimestamp="2026-04-25 00:09:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-25 00:09:07.472460499 +0000 UTC m=+926.077187099" watchObservedRunningTime="2026-04-25 00:09:07.474785674 +0000 UTC m=+926.079512277"
Apr 25 00:09:14.994184 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:14.994139 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-55694477d7-fmmpm"
Apr 25 00:09:14.994184 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:14.994193 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-55694477d7-fmmpm"
Apr 25 00:09:14.996949 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:14.996927 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-55694477d7-fmmpm"
Apr 25 00:09:15.481685 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:15.481650 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-55694477d7-fmmpm"
Apr 25 00:09:36.485746 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:09:36.485711 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-55694477d7-fmmpm"
Apr 25 00:10:55.829123 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:10:55.829085 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-55694477d7-fmmpm"]
Apr 25 00:10:55.829666 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:10:55.829425 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-55694477d7-fmmpm" podUID="3a1d2fae-cb9e-4e07-a230-16c2149ba657" containerName="main" containerID="cri-o://b566a71f887fbd2088938d1fad21abaf1df11c23e7535a6b9d3a9ebf02fbf2aa" gracePeriod=30
Apr 25 00:10:55.829666 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:10:55.829547 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-55694477d7-fmmpm" podUID="3a1d2fae-cb9e-4e07-a230-16c2149ba657" containerName="tokenizer" containerID="cri-o://13d85176daee3c772b3f7d59eae928a5b9326a363f1381716d23c79ca08d96f8" gracePeriod=30
Apr 25 00:10:56.484388 ip-10-0-133-214 kubenswrapper[2569]: W0425 00:10:56.484359 2569 logging.go:55] [core] [Channel #162 SubChannel #163]grpc: addrConn.createTransport failed to connect to {Addr: "10.133.0.54:9003", ServerName: "10.133.0.54:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.133.0.54:9003: connect: connection refused"
Apr 25 00:10:56.874394 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:10:56.874352 2569 generic.go:358] "Generic (PLEG): container finished" podID="3a1d2fae-cb9e-4e07-a230-16c2149ba657" containerID="b566a71f887fbd2088938d1fad21abaf1df11c23e7535a6b9d3a9ebf02fbf2aa" exitCode=0
Apr 25 00:10:56.874791 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:10:56.874442 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-55694477d7-fmmpm" event={"ID":"3a1d2fae-cb9e-4e07-a230-16c2149ba657","Type":"ContainerDied","Data":"b566a71f887fbd2088938d1fad21abaf1df11c23e7535a6b9d3a9ebf02fbf2aa"}
Apr 25 00:10:57.084710 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:10:57.084688 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-55694477d7-fmmpm"
Apr 25 00:10:57.208379 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:10:57.208294 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/3a1d2fae-cb9e-4e07-a230-16c2149ba657-tokenizer-uds\") pod \"3a1d2fae-cb9e-4e07-a230-16c2149ba657\" (UID: \"3a1d2fae-cb9e-4e07-a230-16c2149ba657\") "
Apr 25 00:10:57.208379 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:10:57.208336 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3a1d2fae-cb9e-4e07-a230-16c2149ba657-kserve-provision-location\") pod \"3a1d2fae-cb9e-4e07-a230-16c2149ba657\" (UID: \"3a1d2fae-cb9e-4e07-a230-16c2149ba657\") "
Apr 25 00:10:57.208379 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:10:57.208356 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/3a1d2fae-cb9e-4e07-a230-16c2149ba657-tokenizer-tmp\") pod \"3a1d2fae-cb9e-4e07-a230-16c2149ba657\" (UID: \"3a1d2fae-cb9e-4e07-a230-16c2149ba657\") "
Apr 25 00:10:57.208697 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:10:57.208440 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/3a1d2fae-cb9e-4e07-a230-16c2149ba657-tokenizer-cache\") pod \"3a1d2fae-cb9e-4e07-a230-16c2149ba657\" (UID: \"3a1d2fae-cb9e-4e07-a230-16c2149ba657\") "
Apr 25 00:10:57.208697 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:10:57.208462 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7wzdx\" (UniqueName: \"kubernetes.io/projected/3a1d2fae-cb9e-4e07-a230-16c2149ba657-kube-api-access-7wzdx\") pod \"3a1d2fae-cb9e-4e07-a230-16c2149ba657\" (UID: \"3a1d2fae-cb9e-4e07-a230-16c2149ba657\") "
Apr 25 00:10:57.208697 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:10:57.208509 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3a1d2fae-cb9e-4e07-a230-16c2149ba657-tls-certs\") pod \"3a1d2fae-cb9e-4e07-a230-16c2149ba657\" (UID: \"3a1d2fae-cb9e-4e07-a230-16c2149ba657\") "
Apr 25 00:10:57.208697 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:10:57.208648 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a1d2fae-cb9e-4e07-a230-16c2149ba657-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "3a1d2fae-cb9e-4e07-a230-16c2149ba657" (UID: "3a1d2fae-cb9e-4e07-a230-16c2149ba657"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 25 00:10:57.208880 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:10:57.208711 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a1d2fae-cb9e-4e07-a230-16c2149ba657-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "3a1d2fae-cb9e-4e07-a230-16c2149ba657" (UID: "3a1d2fae-cb9e-4e07-a230-16c2149ba657"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 25 00:10:57.208880 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:10:57.208758 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a1d2fae-cb9e-4e07-a230-16c2149ba657-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "3a1d2fae-cb9e-4e07-a230-16c2149ba657" (UID: "3a1d2fae-cb9e-4e07-a230-16c2149ba657"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 25 00:10:57.208880 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:10:57.208782 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/3a1d2fae-cb9e-4e07-a230-16c2149ba657-tokenizer-uds\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\""
Apr 25 00:10:57.209127 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:10:57.209109 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a1d2fae-cb9e-4e07-a230-16c2149ba657-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "3a1d2fae-cb9e-4e07-a230-16c2149ba657" (UID: "3a1d2fae-cb9e-4e07-a230-16c2149ba657"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 25 00:10:57.210698 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:10:57.210675 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a1d2fae-cb9e-4e07-a230-16c2149ba657-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "3a1d2fae-cb9e-4e07-a230-16c2149ba657" (UID: "3a1d2fae-cb9e-4e07-a230-16c2149ba657"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 25 00:10:57.210802 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:10:57.210782 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a1d2fae-cb9e-4e07-a230-16c2149ba657-kube-api-access-7wzdx" (OuterVolumeSpecName: "kube-api-access-7wzdx") pod "3a1d2fae-cb9e-4e07-a230-16c2149ba657" (UID: "3a1d2fae-cb9e-4e07-a230-16c2149ba657"). InnerVolumeSpecName "kube-api-access-7wzdx".
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 25 00:10:57.309740 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:10:57.309703 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3a1d2fae-cb9e-4e07-a230-16c2149ba657-kserve-provision-location\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\"" Apr 25 00:10:57.309740 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:10:57.309732 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/3a1d2fae-cb9e-4e07-a230-16c2149ba657-tokenizer-tmp\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\"" Apr 25 00:10:57.309740 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:10:57.309744 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/3a1d2fae-cb9e-4e07-a230-16c2149ba657-tokenizer-cache\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\"" Apr 25 00:10:57.309968 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:10:57.309753 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7wzdx\" (UniqueName: \"kubernetes.io/projected/3a1d2fae-cb9e-4e07-a230-16c2149ba657-kube-api-access-7wzdx\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\"" Apr 25 00:10:57.309968 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:10:57.309763 2569 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3a1d2fae-cb9e-4e07-a230-16c2149ba657-tls-certs\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\"" Apr 25 00:10:57.484724 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:10:57.484633 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-55694477d7-fmmpm" podUID="3a1d2fae-cb9e-4e07-a230-16c2149ba657" containerName="main" probeResult="failure" output="timeout: failed to connect service 
\"10.133.0.54:9003\" within 1s: context deadline exceeded" Apr 25 00:10:57.879529 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:10:57.879492 2569 generic.go:358] "Generic (PLEG): container finished" podID="3a1d2fae-cb9e-4e07-a230-16c2149ba657" containerID="13d85176daee3c772b3f7d59eae928a5b9326a363f1381716d23c79ca08d96f8" exitCode=0 Apr 25 00:10:57.879936 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:10:57.879574 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-55694477d7-fmmpm" Apr 25 00:10:57.879936 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:10:57.879572 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-55694477d7-fmmpm" event={"ID":"3a1d2fae-cb9e-4e07-a230-16c2149ba657","Type":"ContainerDied","Data":"13d85176daee3c772b3f7d59eae928a5b9326a363f1381716d23c79ca08d96f8"} Apr 25 00:10:57.879936 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:10:57.879610 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-55694477d7-fmmpm" event={"ID":"3a1d2fae-cb9e-4e07-a230-16c2149ba657","Type":"ContainerDied","Data":"174cd7f539dead6af2c443de25e79f50247dfcdec0b2c35d72703775a7970082"} Apr 25 00:10:57.879936 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:10:57.879626 2569 scope.go:117] "RemoveContainer" containerID="13d85176daee3c772b3f7d59eae928a5b9326a363f1381716d23c79ca08d96f8" Apr 25 00:10:57.888529 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:10:57.888509 2569 scope.go:117] "RemoveContainer" containerID="b566a71f887fbd2088938d1fad21abaf1df11c23e7535a6b9d3a9ebf02fbf2aa" Apr 25 00:10:57.896014 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:10:57.895997 2569 scope.go:117] "RemoveContainer" containerID="960cbeb575957e0f4ea9e50f6ab17b5df87a4c3f13a658033f126d093c4e8c02" Apr 25 00:10:57.902061 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:10:57.902020 
2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-55694477d7-fmmpm"] Apr 25 00:10:57.904799 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:10:57.904752 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-55694477d7-fmmpm"] Apr 25 00:10:57.904875 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:10:57.904824 2569 scope.go:117] "RemoveContainer" containerID="13d85176daee3c772b3f7d59eae928a5b9326a363f1381716d23c79ca08d96f8" Apr 25 00:10:57.905097 ip-10-0-133-214 kubenswrapper[2569]: E0425 00:10:57.905077 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13d85176daee3c772b3f7d59eae928a5b9326a363f1381716d23c79ca08d96f8\": container with ID starting with 13d85176daee3c772b3f7d59eae928a5b9326a363f1381716d23c79ca08d96f8 not found: ID does not exist" containerID="13d85176daee3c772b3f7d59eae928a5b9326a363f1381716d23c79ca08d96f8" Apr 25 00:10:57.905146 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:10:57.905105 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13d85176daee3c772b3f7d59eae928a5b9326a363f1381716d23c79ca08d96f8"} err="failed to get container status \"13d85176daee3c772b3f7d59eae928a5b9326a363f1381716d23c79ca08d96f8\": rpc error: code = NotFound desc = could not find container \"13d85176daee3c772b3f7d59eae928a5b9326a363f1381716d23c79ca08d96f8\": container with ID starting with 13d85176daee3c772b3f7d59eae928a5b9326a363f1381716d23c79ca08d96f8 not found: ID does not exist" Apr 25 00:10:57.905146 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:10:57.905123 2569 scope.go:117] "RemoveContainer" containerID="b566a71f887fbd2088938d1fad21abaf1df11c23e7535a6b9d3a9ebf02fbf2aa" Apr 25 00:10:57.905367 ip-10-0-133-214 kubenswrapper[2569]: E0425 00:10:57.905347 2569 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"b566a71f887fbd2088938d1fad21abaf1df11c23e7535a6b9d3a9ebf02fbf2aa\": container with ID starting with b566a71f887fbd2088938d1fad21abaf1df11c23e7535a6b9d3a9ebf02fbf2aa not found: ID does not exist" containerID="b566a71f887fbd2088938d1fad21abaf1df11c23e7535a6b9d3a9ebf02fbf2aa" Apr 25 00:10:57.905477 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:10:57.905371 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b566a71f887fbd2088938d1fad21abaf1df11c23e7535a6b9d3a9ebf02fbf2aa"} err="failed to get container status \"b566a71f887fbd2088938d1fad21abaf1df11c23e7535a6b9d3a9ebf02fbf2aa\": rpc error: code = NotFound desc = could not find container \"b566a71f887fbd2088938d1fad21abaf1df11c23e7535a6b9d3a9ebf02fbf2aa\": container with ID starting with b566a71f887fbd2088938d1fad21abaf1df11c23e7535a6b9d3a9ebf02fbf2aa not found: ID does not exist" Apr 25 00:10:57.905477 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:10:57.905386 2569 scope.go:117] "RemoveContainer" containerID="960cbeb575957e0f4ea9e50f6ab17b5df87a4c3f13a658033f126d093c4e8c02" Apr 25 00:10:57.905631 ip-10-0-133-214 kubenswrapper[2569]: E0425 00:10:57.905616 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"960cbeb575957e0f4ea9e50f6ab17b5df87a4c3f13a658033f126d093c4e8c02\": container with ID starting with 960cbeb575957e0f4ea9e50f6ab17b5df87a4c3f13a658033f126d093c4e8c02 not found: ID does not exist" containerID="960cbeb575957e0f4ea9e50f6ab17b5df87a4c3f13a658033f126d093c4e8c02" Apr 25 00:10:57.905672 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:10:57.905636 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"960cbeb575957e0f4ea9e50f6ab17b5df87a4c3f13a658033f126d093c4e8c02"} err="failed to get container status \"960cbeb575957e0f4ea9e50f6ab17b5df87a4c3f13a658033f126d093c4e8c02\": rpc 
error: code = NotFound desc = could not find container \"960cbeb575957e0f4ea9e50f6ab17b5df87a4c3f13a658033f126d093c4e8c02\": container with ID starting with 960cbeb575957e0f4ea9e50f6ab17b5df87a4c3f13a658033f126d093c4e8c02 not found: ID does not exist" Apr 25 00:10:57.977755 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:10:57.977727 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a1d2fae-cb9e-4e07-a230-16c2149ba657" path="/var/lib/kubelet/pods/3a1d2fae-cb9e-4e07-a230-16c2149ba657/volumes" Apr 25 00:11:18.939651 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:11:18.939568 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-55694477d7-wnrs2"] Apr 25 00:11:18.939995 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:11:18.939969 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3a1d2fae-cb9e-4e07-a230-16c2149ba657" containerName="storage-initializer" Apr 25 00:11:18.939995 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:11:18.939981 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a1d2fae-cb9e-4e07-a230-16c2149ba657" containerName="storage-initializer" Apr 25 00:11:18.939995 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:11:18.939992 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3a1d2fae-cb9e-4e07-a230-16c2149ba657" containerName="tokenizer" Apr 25 00:11:18.940166 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:11:18.939999 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a1d2fae-cb9e-4e07-a230-16c2149ba657" containerName="tokenizer" Apr 25 00:11:18.940166 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:11:18.940035 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3a1d2fae-cb9e-4e07-a230-16c2149ba657" containerName="main" Apr 25 00:11:18.940166 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:11:18.940045 2569 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="3a1d2fae-cb9e-4e07-a230-16c2149ba657" containerName="main" Apr 25 00:11:18.940166 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:11:18.940112 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="3a1d2fae-cb9e-4e07-a230-16c2149ba657" containerName="tokenizer" Apr 25 00:11:18.940166 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:11:18.940126 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="3a1d2fae-cb9e-4e07-a230-16c2149ba657" containerName="main" Apr 25 00:11:18.943594 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:11:18.943576 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-55694477d7-wnrs2" Apr 25 00:11:18.946126 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:11:18.946099 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 25 00:11:18.946257 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:11:18.946152 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-4dwzn\"" Apr 25 00:11:18.946257 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:11:18.946249 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 25 00:11:18.946380 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:11:18.946249 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-kserve-self-signed-certs\"" Apr 25 00:11:18.947014 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:11:18.946985 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-epp-sa-dockercfg-ws2w5\"" Apr 25 00:11:18.953886 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:11:18.953863 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-55694477d7-wnrs2"] Apr 25 00:11:19.087440 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:11:19.087383 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/b8941e9d-c1ba-4e54-a62b-492705d58230-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-55694477d7-wnrs2\" (UID: \"b8941e9d-c1ba-4e54-a62b-492705d58230\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-55694477d7-wnrs2" Apr 25 00:11:19.087622 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:11:19.087448 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/b8941e9d-c1ba-4e54-a62b-492705d58230-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-55694477d7-wnrs2\" (UID: \"b8941e9d-c1ba-4e54-a62b-492705d58230\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-55694477d7-wnrs2" Apr 25 00:11:19.087622 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:11:19.087497 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/b8941e9d-c1ba-4e54-a62b-492705d58230-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-55694477d7-wnrs2\" (UID: \"b8941e9d-c1ba-4e54-a62b-492705d58230\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-55694477d7-wnrs2" Apr 25 00:11:19.087622 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:11:19.087517 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b8941e9d-c1ba-4e54-a62b-492705d58230-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-55694477d7-wnrs2\" (UID: \"b8941e9d-c1ba-4e54-a62b-492705d58230\") " 
pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-55694477d7-wnrs2" Apr 25 00:11:19.087622 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:11:19.087540 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b8941e9d-c1ba-4e54-a62b-492705d58230-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-55694477d7-wnrs2\" (UID: \"b8941e9d-c1ba-4e54-a62b-492705d58230\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-55694477d7-wnrs2" Apr 25 00:11:19.087622 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:11:19.087570 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94r49\" (UniqueName: \"kubernetes.io/projected/b8941e9d-c1ba-4e54-a62b-492705d58230-kube-api-access-94r49\") pod \"stop-feature-test-kserve-router-scheduler-55694477d7-wnrs2\" (UID: \"b8941e9d-c1ba-4e54-a62b-492705d58230\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-55694477d7-wnrs2" Apr 25 00:11:19.188958 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:11:19.188927 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b8941e9d-c1ba-4e54-a62b-492705d58230-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-55694477d7-wnrs2\" (UID: \"b8941e9d-c1ba-4e54-a62b-492705d58230\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-55694477d7-wnrs2" Apr 25 00:11:19.189148 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:11:19.188973 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-94r49\" (UniqueName: \"kubernetes.io/projected/b8941e9d-c1ba-4e54-a62b-492705d58230-kube-api-access-94r49\") pod \"stop-feature-test-kserve-router-scheduler-55694477d7-wnrs2\" (UID: 
\"b8941e9d-c1ba-4e54-a62b-492705d58230\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-55694477d7-wnrs2" Apr 25 00:11:19.189148 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:11:19.189010 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/b8941e9d-c1ba-4e54-a62b-492705d58230-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-55694477d7-wnrs2\" (UID: \"b8941e9d-c1ba-4e54-a62b-492705d58230\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-55694477d7-wnrs2" Apr 25 00:11:19.189148 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:11:19.189031 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/b8941e9d-c1ba-4e54-a62b-492705d58230-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-55694477d7-wnrs2\" (UID: \"b8941e9d-c1ba-4e54-a62b-492705d58230\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-55694477d7-wnrs2" Apr 25 00:11:19.189148 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:11:19.189060 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/b8941e9d-c1ba-4e54-a62b-492705d58230-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-55694477d7-wnrs2\" (UID: \"b8941e9d-c1ba-4e54-a62b-492705d58230\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-55694477d7-wnrs2" Apr 25 00:11:19.189148 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:11:19.189136 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b8941e9d-c1ba-4e54-a62b-492705d58230-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-55694477d7-wnrs2\" (UID: \"b8941e9d-c1ba-4e54-a62b-492705d58230\") " 
pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-55694477d7-wnrs2" Apr 25 00:11:19.189455 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:11:19.189353 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b8941e9d-c1ba-4e54-a62b-492705d58230-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-55694477d7-wnrs2\" (UID: \"b8941e9d-c1ba-4e54-a62b-492705d58230\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-55694477d7-wnrs2" Apr 25 00:11:19.189523 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:11:19.189453 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/b8941e9d-c1ba-4e54-a62b-492705d58230-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-55694477d7-wnrs2\" (UID: \"b8941e9d-c1ba-4e54-a62b-492705d58230\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-55694477d7-wnrs2" Apr 25 00:11:19.189560 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:11:19.189514 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/b8941e9d-c1ba-4e54-a62b-492705d58230-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-55694477d7-wnrs2\" (UID: \"b8941e9d-c1ba-4e54-a62b-492705d58230\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-55694477d7-wnrs2" Apr 25 00:11:19.189560 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:11:19.189536 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/b8941e9d-c1ba-4e54-a62b-492705d58230-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-55694477d7-wnrs2\" (UID: \"b8941e9d-c1ba-4e54-a62b-492705d58230\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-55694477d7-wnrs2" Apr 25 
00:11:19.191624 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:11:19.191567 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b8941e9d-c1ba-4e54-a62b-492705d58230-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-55694477d7-wnrs2\" (UID: \"b8941e9d-c1ba-4e54-a62b-492705d58230\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-55694477d7-wnrs2" Apr 25 00:11:19.196153 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:11:19.196135 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-94r49\" (UniqueName: \"kubernetes.io/projected/b8941e9d-c1ba-4e54-a62b-492705d58230-kube-api-access-94r49\") pod \"stop-feature-test-kserve-router-scheduler-55694477d7-wnrs2\" (UID: \"b8941e9d-c1ba-4e54-a62b-492705d58230\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-55694477d7-wnrs2" Apr 25 00:11:19.254929 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:11:19.254901 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-55694477d7-wnrs2" Apr 25 00:11:19.384924 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:11:19.384892 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-55694477d7-wnrs2"] Apr 25 00:11:19.386427 ip-10-0-133-214 kubenswrapper[2569]: W0425 00:11:19.386386 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8941e9d_c1ba_4e54_a62b_492705d58230.slice/crio-78d30be3d8ea6a1ab5d479cd864819ad35c066c9ef8c198b296acdfc78d5c18d WatchSource:0}: Error finding container 78d30be3d8ea6a1ab5d479cd864819ad35c066c9ef8c198b296acdfc78d5c18d: Status 404 returned error can't find the container with id 78d30be3d8ea6a1ab5d479cd864819ad35c066c9ef8c198b296acdfc78d5c18d Apr 25 00:11:19.965265 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:11:19.965228 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-55694477d7-wnrs2" event={"ID":"b8941e9d-c1ba-4e54-a62b-492705d58230","Type":"ContainerStarted","Data":"079bd8d55b3245e2ff33515342d2489d5aaa3738e6f62ce4f8a90e90539bd90b"} Apr 25 00:11:19.965265 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:11:19.965271 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-55694477d7-wnrs2" event={"ID":"b8941e9d-c1ba-4e54-a62b-492705d58230","Type":"ContainerStarted","Data":"78d30be3d8ea6a1ab5d479cd864819ad35c066c9ef8c198b296acdfc78d5c18d"} Apr 25 00:11:20.969695 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:11:20.969656 2569 generic.go:358] "Generic (PLEG): container finished" podID="b8941e9d-c1ba-4e54-a62b-492705d58230" containerID="079bd8d55b3245e2ff33515342d2489d5aaa3738e6f62ce4f8a90e90539bd90b" exitCode=0 Apr 25 00:11:20.970075 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:11:20.969746 2569 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-55694477d7-wnrs2" event={"ID":"b8941e9d-c1ba-4e54-a62b-492705d58230","Type":"ContainerDied","Data":"079bd8d55b3245e2ff33515342d2489d5aaa3738e6f62ce4f8a90e90539bd90b"} Apr 25 00:11:21.978625 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:11:21.978601 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-55694477d7-wnrs2" Apr 25 00:11:21.978990 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:11:21.978629 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-55694477d7-wnrs2" event={"ID":"b8941e9d-c1ba-4e54-a62b-492705d58230","Type":"ContainerStarted","Data":"9189e62b14809fa707a9eb9406f0aa63ee54bd6fd6f3b84132ed81f867e97301"} Apr 25 00:11:21.978990 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:11:21.978643 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-55694477d7-wnrs2" event={"ID":"b8941e9d-c1ba-4e54-a62b-492705d58230","Type":"ContainerStarted","Data":"e888752b865e919709b70aca054eec1e9b3321711eeeaadc8e326a87270653a7"} Apr 25 00:11:21.996805 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:11:21.996757 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-55694477d7-wnrs2" podStartSLOduration=3.996741378 podStartE2EDuration="3.996741378s" podCreationTimestamp="2026-04-25 00:11:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-25 00:11:21.99467805 +0000 UTC m=+1060.599404649" watchObservedRunningTime="2026-04-25 00:11:21.996741378 +0000 UTC m=+1060.601467989" Apr 25 00:11:29.255550 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:11:29.255514 2569 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-55694477d7-wnrs2" Apr 25 00:11:29.255550 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:11:29.255554 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-55694477d7-wnrs2" Apr 25 00:11:29.258225 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:11:29.258202 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-55694477d7-wnrs2" Apr 25 00:11:30.006591 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:11:30.006561 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-55694477d7-wnrs2" Apr 25 00:11:41.983839 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:11:41.983770 2569 scope.go:117] "RemoveContainer" containerID="2839f4280917b8a19b1a2323aafe627ef571111884e0e3553b0b6170a4ae1c88" Apr 25 00:11:41.992527 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:11:41.992500 2569 scope.go:117] "RemoveContainer" containerID="d1fc2857682db28251468cc3c9845c1c575f58801099c66a53d3bebdffda417b" Apr 25 00:11:51.010150 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:11:51.010116 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-55694477d7-wnrs2" Apr 25 00:26:00.747212 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:26:00.747173 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-5d945688d8-zcv7x"] Apr 25 00:26:00.750851 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:26:00.750829 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-5d945688d8-zcv7x" Apr 25 00:26:00.758081 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:26:00.758060 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mhvk\" (UniqueName: \"kubernetes.io/projected/5cdea8df-86df-4d4f-81b2-d9d6df6c04d2-kube-api-access-9mhvk\") pod \"llmisvc-controller-manager-5d945688d8-zcv7x\" (UID: \"5cdea8df-86df-4d4f-81b2-d9d6df6c04d2\") " pod="kserve/llmisvc-controller-manager-5d945688d8-zcv7x" Apr 25 00:26:00.758160 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:26:00.758104 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5cdea8df-86df-4d4f-81b2-d9d6df6c04d2-cert\") pod \"llmisvc-controller-manager-5d945688d8-zcv7x\" (UID: \"5cdea8df-86df-4d4f-81b2-d9d6df6c04d2\") " pod="kserve/llmisvc-controller-manager-5d945688d8-zcv7x" Apr 25 00:26:00.764017 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:26:00.763988 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-5d945688d8-zcv7x"] Apr 25 00:26:00.858934 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:26:00.858897 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9mhvk\" (UniqueName: \"kubernetes.io/projected/5cdea8df-86df-4d4f-81b2-d9d6df6c04d2-kube-api-access-9mhvk\") pod \"llmisvc-controller-manager-5d945688d8-zcv7x\" (UID: \"5cdea8df-86df-4d4f-81b2-d9d6df6c04d2\") " pod="kserve/llmisvc-controller-manager-5d945688d8-zcv7x" Apr 25 00:26:00.859089 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:26:00.858946 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5cdea8df-86df-4d4f-81b2-d9d6df6c04d2-cert\") pod \"llmisvc-controller-manager-5d945688d8-zcv7x\" (UID: \"5cdea8df-86df-4d4f-81b2-d9d6df6c04d2\") " 
pod="kserve/llmisvc-controller-manager-5d945688d8-zcv7x" Apr 25 00:26:00.861252 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:26:00.861226 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5cdea8df-86df-4d4f-81b2-d9d6df6c04d2-cert\") pod \"llmisvc-controller-manager-5d945688d8-zcv7x\" (UID: \"5cdea8df-86df-4d4f-81b2-d9d6df6c04d2\") " pod="kserve/llmisvc-controller-manager-5d945688d8-zcv7x" Apr 25 00:26:00.867673 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:26:00.867636 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mhvk\" (UniqueName: \"kubernetes.io/projected/5cdea8df-86df-4d4f-81b2-d9d6df6c04d2-kube-api-access-9mhvk\") pod \"llmisvc-controller-manager-5d945688d8-zcv7x\" (UID: \"5cdea8df-86df-4d4f-81b2-d9d6df6c04d2\") " pod="kserve/llmisvc-controller-manager-5d945688d8-zcv7x" Apr 25 00:26:01.060571 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:26:01.060544 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-5d945688d8-zcv7x" Apr 25 00:26:01.182557 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:26:01.182527 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-5d945688d8-zcv7x"] Apr 25 00:26:01.183908 ip-10-0-133-214 kubenswrapper[2569]: W0425 00:26:01.183883 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod5cdea8df_86df_4d4f_81b2_d9d6df6c04d2.slice/crio-ca2f8562df86876fba971561aa9e856cace83596cd2a75c51bf0f26c048bd86e WatchSource:0}: Error finding container ca2f8562df86876fba971561aa9e856cace83596cd2a75c51bf0f26c048bd86e: Status 404 returned error can't find the container with id ca2f8562df86876fba971561aa9e856cace83596cd2a75c51bf0f26c048bd86e Apr 25 00:26:01.185584 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:26:01.185565 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 25 00:26:01.243575 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:26:01.243546 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-5d945688d8-zcv7x" event={"ID":"5cdea8df-86df-4d4f-81b2-d9d6df6c04d2","Type":"ContainerStarted","Data":"ca2f8562df86876fba971561aa9e856cace83596cd2a75c51bf0f26c048bd86e"} Apr 25 00:26:02.248827 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:26:02.248774 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-5d945688d8-zcv7x" event={"ID":"5cdea8df-86df-4d4f-81b2-d9d6df6c04d2","Type":"ContainerStarted","Data":"e41a9530c956e6c7bc7657098577c28b17d266e87a3fbb89e224c77d94c1527c"} Apr 25 00:26:02.249229 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:26:02.248888 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-5d945688d8-zcv7x" Apr 25 00:26:02.265718 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:26:02.265675 2569 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-5d945688d8-zcv7x" podStartSLOduration=1.760021084 podStartE2EDuration="2.265661s" podCreationTimestamp="2026-04-25 00:26:00 +0000 UTC" firstStartedPulling="2026-04-25 00:26:01.185724587 +0000 UTC m=+1939.790451163" lastFinishedPulling="2026-04-25 00:26:01.691364503 +0000 UTC m=+1940.296091079" observedRunningTime="2026-04-25 00:26:02.264094497 +0000 UTC m=+1940.868821092" watchObservedRunningTime="2026-04-25 00:26:02.265661 +0000 UTC m=+1940.870387598" Apr 25 00:26:33.254122 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:26:33.254047 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-5d945688d8-zcv7x" Apr 25 00:26:33.295483 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:26:33.295449 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/llmisvc-controller-manager-7689784d4c-wg8df"] Apr 25 00:26:33.295759 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:26:33.295732 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/llmisvc-controller-manager-7689784d4c-wg8df" podUID="599c0f54-e37c-4031-bd43-1e7c027ea21b" containerName="manager" containerID="cri-o://eaa1e61cfb73aee0c0f432e1d9424e2b7cf07f7fc7270545fb107240d1f57ca9" gracePeriod=30 Apr 25 00:26:33.542757 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:26:33.542735 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-7689784d4c-wg8df" Apr 25 00:26:33.627423 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:26:33.627369 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5g8g\" (UniqueName: \"kubernetes.io/projected/599c0f54-e37c-4031-bd43-1e7c027ea21b-kube-api-access-m5g8g\") pod \"599c0f54-e37c-4031-bd43-1e7c027ea21b\" (UID: \"599c0f54-e37c-4031-bd43-1e7c027ea21b\") " Apr 25 00:26:33.627603 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:26:33.627483 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/599c0f54-e37c-4031-bd43-1e7c027ea21b-cert\") pod \"599c0f54-e37c-4031-bd43-1e7c027ea21b\" (UID: \"599c0f54-e37c-4031-bd43-1e7c027ea21b\") " Apr 25 00:26:33.629424 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:26:33.629387 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/599c0f54-e37c-4031-bd43-1e7c027ea21b-kube-api-access-m5g8g" (OuterVolumeSpecName: "kube-api-access-m5g8g") pod "599c0f54-e37c-4031-bd43-1e7c027ea21b" (UID: "599c0f54-e37c-4031-bd43-1e7c027ea21b"). InnerVolumeSpecName "kube-api-access-m5g8g". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 25 00:26:33.629551 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:26:33.629533 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/599c0f54-e37c-4031-bd43-1e7c027ea21b-cert" (OuterVolumeSpecName: "cert") pod "599c0f54-e37c-4031-bd43-1e7c027ea21b" (UID: "599c0f54-e37c-4031-bd43-1e7c027ea21b"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 25 00:26:33.728394 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:26:33.728360 2569 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/599c0f54-e37c-4031-bd43-1e7c027ea21b-cert\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\"" Apr 25 00:26:33.728394 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:26:33.728390 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-m5g8g\" (UniqueName: \"kubernetes.io/projected/599c0f54-e37c-4031-bd43-1e7c027ea21b-kube-api-access-m5g8g\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\"" Apr 25 00:26:34.373694 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:26:34.373653 2569 generic.go:358] "Generic (PLEG): container finished" podID="599c0f54-e37c-4031-bd43-1e7c027ea21b" containerID="eaa1e61cfb73aee0c0f432e1d9424e2b7cf07f7fc7270545fb107240d1f57ca9" exitCode=0 Apr 25 00:26:34.374102 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:26:34.373717 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-7689784d4c-wg8df" Apr 25 00:26:34.374102 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:26:34.373739 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-7689784d4c-wg8df" event={"ID":"599c0f54-e37c-4031-bd43-1e7c027ea21b","Type":"ContainerDied","Data":"eaa1e61cfb73aee0c0f432e1d9424e2b7cf07f7fc7270545fb107240d1f57ca9"} Apr 25 00:26:34.374102 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:26:34.373780 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-7689784d4c-wg8df" event={"ID":"599c0f54-e37c-4031-bd43-1e7c027ea21b","Type":"ContainerDied","Data":"a451483371b9fb7bd33b76276a18de6b33847111d07d706f0ff790142f574460"} Apr 25 00:26:34.374102 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:26:34.373819 2569 scope.go:117] "RemoveContainer" containerID="eaa1e61cfb73aee0c0f432e1d9424e2b7cf07f7fc7270545fb107240d1f57ca9" Apr 25 00:26:34.382937 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:26:34.382920 2569 scope.go:117] "RemoveContainer" containerID="eaa1e61cfb73aee0c0f432e1d9424e2b7cf07f7fc7270545fb107240d1f57ca9" Apr 25 00:26:34.383162 ip-10-0-133-214 kubenswrapper[2569]: E0425 00:26:34.383144 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eaa1e61cfb73aee0c0f432e1d9424e2b7cf07f7fc7270545fb107240d1f57ca9\": container with ID starting with eaa1e61cfb73aee0c0f432e1d9424e2b7cf07f7fc7270545fb107240d1f57ca9 not found: ID does not exist" containerID="eaa1e61cfb73aee0c0f432e1d9424e2b7cf07f7fc7270545fb107240d1f57ca9" Apr 25 00:26:34.383211 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:26:34.383173 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eaa1e61cfb73aee0c0f432e1d9424e2b7cf07f7fc7270545fb107240d1f57ca9"} err="failed to get container status 
\"eaa1e61cfb73aee0c0f432e1d9424e2b7cf07f7fc7270545fb107240d1f57ca9\": rpc error: code = NotFound desc = could not find container \"eaa1e61cfb73aee0c0f432e1d9424e2b7cf07f7fc7270545fb107240d1f57ca9\": container with ID starting with eaa1e61cfb73aee0c0f432e1d9424e2b7cf07f7fc7270545fb107240d1f57ca9 not found: ID does not exist" Apr 25 00:26:34.395564 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:26:34.395535 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/llmisvc-controller-manager-7689784d4c-wg8df"] Apr 25 00:26:34.395898 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:26:34.395864 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/llmisvc-controller-manager-7689784d4c-wg8df"] Apr 25 00:26:35.977793 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:26:35.977761 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="599c0f54-e37c-4031-bd43-1e7c027ea21b" path="/var/lib/kubelet/pods/599c0f54-e37c-4031-bd43-1e7c027ea21b/volumes" Apr 25 00:30:56.550393 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:30:56.550357 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 25 00:30:56.551305 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:30:56.551277 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="599c0f54-e37c-4031-bd43-1e7c027ea21b" containerName="manager" Apr 25 00:30:56.551305 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:30:56.551305 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="599c0f54-e37c-4031-bd43-1e7c027ea21b" containerName="manager" Apr 25 00:30:56.551598 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:30:56.551390 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="599c0f54-e37c-4031-bd43-1e7c027ea21b" containerName="manager" Apr 25 00:30:56.554726 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:30:56.554706 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 25 00:30:56.558278 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:30:56.558081 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs\"" Apr 25 00:30:56.561600 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:30:56.561575 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-f312f5-cb7fb8cf-dockercfg-bc7j6\"" Apr 25 00:30:56.567389 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:30:56.567368 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 25 00:30:56.682917 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:30:56.682880 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7-tmp-dir\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 25 00:30:56.683115 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:30:56.682928 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mk8g\" (UniqueName: \"kubernetes.io/projected/5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7-kube-api-access-4mk8g\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 25 00:30:56.683115 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:30:56.682956 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" 
(UniqueName: \"kubernetes.io/empty-dir/5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 25 00:30:56.683115 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:30:56.683040 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 25 00:30:56.683115 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:30:56.683091 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 25 00:30:56.683279 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:30:56.683147 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 25 00:30:56.683279 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:30:56.683208 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: 
\"kubernetes.io/empty-dir/5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 25 00:30:56.784523 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:30:56.784487 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7-tmp-dir\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 25 00:30:56.784716 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:30:56.784534 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4mk8g\" (UniqueName: \"kubernetes.io/projected/5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7-kube-api-access-4mk8g\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 25 00:30:56.784716 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:30:56.784563 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 25 00:30:56.784716 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:30:56.784584 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: 
\"5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 25 00:30:56.784716 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:30:56.784607 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 25 00:30:56.784938 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:30:56.784713 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 25 00:30:56.784938 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:30:56.784808 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 25 00:30:56.784938 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:30:56.784886 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7-tmp-dir\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 25 00:30:56.785093 
ip-10-0-133-214 kubenswrapper[2569]: I0425 00:30:56.784991 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 25 00:30:56.785093 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:30:56.785040 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 25 00:30:56.785093 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:30:56.785089 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 25 00:30:56.786859 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:30:56.786836 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 25 00:30:56.787081 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:30:56.787062 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 25 00:30:56.792321 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:30:56.792304 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mk8g\" (UniqueName: \"kubernetes.io/projected/5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7-kube-api-access-4mk8g\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 25 00:30:56.866254 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:30:56.866170 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 25 00:30:57.002223 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:30:57.002188 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 25 00:30:57.003014 ip-10-0-133-214 kubenswrapper[2569]: W0425 00:30:57.002977 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f1f9e22_5f86_4ea2_aae9_f1dd0a1c9ce7.slice/crio-5b460758593f349a7646f6cff7ab71f8a5c14850aea27082b2481d10696b403c WatchSource:0}: Error finding container 5b460758593f349a7646f6cff7ab71f8a5c14850aea27082b2481d10696b403c: Status 404 returned error can't find the container with id 5b460758593f349a7646f6cff7ab71f8a5c14850aea27082b2481d10696b403c Apr 25 00:30:57.348110 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:30:57.348073 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" 
event={"ID":"5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7","Type":"ContainerStarted","Data":"b8c0295472295d4524c16174fc4fc826dc4a2ae6b8ad2575c8407f19d31b0f94"} Apr 25 00:30:57.348110 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:30:57.348112 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7","Type":"ContainerStarted","Data":"5b460758593f349a7646f6cff7ab71f8a5c14850aea27082b2481d10696b403c"} Apr 25 00:31:01.367143 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:31:01.367110 2569 generic.go:358] "Generic (PLEG): container finished" podID="5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7" containerID="b8c0295472295d4524c16174fc4fc826dc4a2ae6b8ad2575c8407f19d31b0f94" exitCode=0 Apr 25 00:31:01.367523 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:31:01.367171 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7","Type":"ContainerDied","Data":"b8c0295472295d4524c16174fc4fc826dc4a2ae6b8ad2575c8407f19d31b0f94"} Apr 25 00:31:01.368311 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:31:01.368293 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 25 00:31:45.544205 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:31:45.544168 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7","Type":"ContainerStarted","Data":"bf31f46bfd356fddd714d62942cc2b97b2093c1053ae41fdb0d6d68d1825727f"} Apr 25 00:31:45.562679 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:31:45.562627 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" podStartSLOduration=6.012565195 
podStartE2EDuration="49.5626121s" podCreationTimestamp="2026-04-25 00:30:56 +0000 UTC" firstStartedPulling="2026-04-25 00:31:01.368434537 +0000 UTC m=+2239.973161113" lastFinishedPulling="2026-04-25 00:31:44.918481431 +0000 UTC m=+2283.523208018" observedRunningTime="2026-04-25 00:31:45.560552272 +0000 UTC m=+2284.165278869" watchObservedRunningTime="2026-04-25 00:31:45.5626121 +0000 UTC m=+2284.167338737" Apr 25 00:35:05.068846 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:05.068810 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-9ft6r"] Apr 25 00:35:05.072685 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:05.072662 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-9ft6r" Apr 25 00:35:05.075059 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:05.075038 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"istio-ca-root-cert\"" Apr 25 00:35:05.075172 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:05.075117 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-gateway-2-openshift-default-dockercfg-gvl9t\"" Apr 25 00:35:05.083174 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:05.083148 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-9ft6r"] Apr 25 00:35:05.119744 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:05.119710 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/2f5d2ff8-3f89-412b-98e7-44da88df0a7a-workload-socket\") pod \"router-gateway-2-openshift-default-6866b85949-9ft6r\" (UID: \"2f5d2ff8-3f89-412b-98e7-44da88df0a7a\") " 
pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-9ft6r" Apr 25 00:35:05.119924 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:05.119773 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/2f5d2ff8-3f89-412b-98e7-44da88df0a7a-istio-data\") pod \"router-gateway-2-openshift-default-6866b85949-9ft6r\" (UID: \"2f5d2ff8-3f89-412b-98e7-44da88df0a7a\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-9ft6r" Apr 25 00:35:05.119924 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:05.119824 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/2f5d2ff8-3f89-412b-98e7-44da88df0a7a-credential-socket\") pod \"router-gateway-2-openshift-default-6866b85949-9ft6r\" (UID: \"2f5d2ff8-3f89-412b-98e7-44da88df0a7a\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-9ft6r" Apr 25 00:35:05.119924 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:05.119894 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/2f5d2ff8-3f89-412b-98e7-44da88df0a7a-istiod-ca-cert\") pod \"router-gateway-2-openshift-default-6866b85949-9ft6r\" (UID: \"2f5d2ff8-3f89-412b-98e7-44da88df0a7a\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-9ft6r" Apr 25 00:35:05.120086 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:05.119963 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/2f5d2ff8-3f89-412b-98e7-44da88df0a7a-istio-podinfo\") pod \"router-gateway-2-openshift-default-6866b85949-9ft6r\" (UID: \"2f5d2ff8-3f89-412b-98e7-44da88df0a7a\") " 
pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-9ft6r" Apr 25 00:35:05.120086 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:05.120066 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25g2b\" (UniqueName: \"kubernetes.io/projected/2f5d2ff8-3f89-412b-98e7-44da88df0a7a-kube-api-access-25g2b\") pod \"router-gateway-2-openshift-default-6866b85949-9ft6r\" (UID: \"2f5d2ff8-3f89-412b-98e7-44da88df0a7a\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-9ft6r" Apr 25 00:35:05.120197 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:05.120116 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/2f5d2ff8-3f89-412b-98e7-44da88df0a7a-workload-certs\") pod \"router-gateway-2-openshift-default-6866b85949-9ft6r\" (UID: \"2f5d2ff8-3f89-412b-98e7-44da88df0a7a\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-9ft6r" Apr 25 00:35:05.120197 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:05.120180 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/2f5d2ff8-3f89-412b-98e7-44da88df0a7a-istio-token\") pod \"router-gateway-2-openshift-default-6866b85949-9ft6r\" (UID: \"2f5d2ff8-3f89-412b-98e7-44da88df0a7a\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-9ft6r" Apr 25 00:35:05.120290 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:05.120234 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/2f5d2ff8-3f89-412b-98e7-44da88df0a7a-istio-envoy\") pod \"router-gateway-2-openshift-default-6866b85949-9ft6r\" (UID: \"2f5d2ff8-3f89-412b-98e7-44da88df0a7a\") " 
pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-9ft6r" Apr 25 00:35:05.221689 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:05.221655 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/2f5d2ff8-3f89-412b-98e7-44da88df0a7a-credential-socket\") pod \"router-gateway-2-openshift-default-6866b85949-9ft6r\" (UID: \"2f5d2ff8-3f89-412b-98e7-44da88df0a7a\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-9ft6r" Apr 25 00:35:05.221870 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:05.221696 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/2f5d2ff8-3f89-412b-98e7-44da88df0a7a-istiod-ca-cert\") pod \"router-gateway-2-openshift-default-6866b85949-9ft6r\" (UID: \"2f5d2ff8-3f89-412b-98e7-44da88df0a7a\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-9ft6r" Apr 25 00:35:05.221870 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:05.221720 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/2f5d2ff8-3f89-412b-98e7-44da88df0a7a-istio-podinfo\") pod \"router-gateway-2-openshift-default-6866b85949-9ft6r\" (UID: \"2f5d2ff8-3f89-412b-98e7-44da88df0a7a\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-9ft6r" Apr 25 00:35:05.221870 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:05.221741 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-25g2b\" (UniqueName: \"kubernetes.io/projected/2f5d2ff8-3f89-412b-98e7-44da88df0a7a-kube-api-access-25g2b\") pod \"router-gateway-2-openshift-default-6866b85949-9ft6r\" (UID: \"2f5d2ff8-3f89-412b-98e7-44da88df0a7a\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-9ft6r" Apr 25 00:35:05.221870 ip-10-0-133-214 
kubenswrapper[2569]: I0425 00:35:05.221773 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/2f5d2ff8-3f89-412b-98e7-44da88df0a7a-workload-certs\") pod \"router-gateway-2-openshift-default-6866b85949-9ft6r\" (UID: \"2f5d2ff8-3f89-412b-98e7-44da88df0a7a\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-9ft6r" Apr 25 00:35:05.221870 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:05.221796 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/2f5d2ff8-3f89-412b-98e7-44da88df0a7a-istio-token\") pod \"router-gateway-2-openshift-default-6866b85949-9ft6r\" (UID: \"2f5d2ff8-3f89-412b-98e7-44da88df0a7a\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-9ft6r" Apr 25 00:35:05.221870 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:05.221837 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/2f5d2ff8-3f89-412b-98e7-44da88df0a7a-istio-envoy\") pod \"router-gateway-2-openshift-default-6866b85949-9ft6r\" (UID: \"2f5d2ff8-3f89-412b-98e7-44da88df0a7a\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-9ft6r" Apr 25 00:35:05.221870 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:05.221872 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/2f5d2ff8-3f89-412b-98e7-44da88df0a7a-workload-socket\") pod \"router-gateway-2-openshift-default-6866b85949-9ft6r\" (UID: \"2f5d2ff8-3f89-412b-98e7-44da88df0a7a\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-9ft6r" Apr 25 00:35:05.222332 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:05.221915 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: 
\"kubernetes.io/empty-dir/2f5d2ff8-3f89-412b-98e7-44da88df0a7a-istio-data\") pod \"router-gateway-2-openshift-default-6866b85949-9ft6r\" (UID: \"2f5d2ff8-3f89-412b-98e7-44da88df0a7a\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-9ft6r" Apr 25 00:35:05.222332 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:05.222184 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/2f5d2ff8-3f89-412b-98e7-44da88df0a7a-credential-socket\") pod \"router-gateway-2-openshift-default-6866b85949-9ft6r\" (UID: \"2f5d2ff8-3f89-412b-98e7-44da88df0a7a\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-9ft6r" Apr 25 00:35:05.222623 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:05.222596 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/2f5d2ff8-3f89-412b-98e7-44da88df0a7a-workload-certs\") pod \"router-gateway-2-openshift-default-6866b85949-9ft6r\" (UID: \"2f5d2ff8-3f89-412b-98e7-44da88df0a7a\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-9ft6r" Apr 25 00:35:05.222842 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:05.222786 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/2f5d2ff8-3f89-412b-98e7-44da88df0a7a-istio-data\") pod \"router-gateway-2-openshift-default-6866b85949-9ft6r\" (UID: \"2f5d2ff8-3f89-412b-98e7-44da88df0a7a\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-9ft6r" Apr 25 00:35:05.222842 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:05.222786 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/2f5d2ff8-3f89-412b-98e7-44da88df0a7a-workload-socket\") pod \"router-gateway-2-openshift-default-6866b85949-9ft6r\" (UID: 
\"2f5d2ff8-3f89-412b-98e7-44da88df0a7a\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-9ft6r" Apr 25 00:35:05.223059 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:05.223035 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/2f5d2ff8-3f89-412b-98e7-44da88df0a7a-istiod-ca-cert\") pod \"router-gateway-2-openshift-default-6866b85949-9ft6r\" (UID: \"2f5d2ff8-3f89-412b-98e7-44da88df0a7a\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-9ft6r" Apr 25 00:35:05.224531 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:05.224510 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/2f5d2ff8-3f89-412b-98e7-44da88df0a7a-istio-envoy\") pod \"router-gateway-2-openshift-default-6866b85949-9ft6r\" (UID: \"2f5d2ff8-3f89-412b-98e7-44da88df0a7a\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-9ft6r" Apr 25 00:35:05.224918 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:05.224899 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/2f5d2ff8-3f89-412b-98e7-44da88df0a7a-istio-podinfo\") pod \"router-gateway-2-openshift-default-6866b85949-9ft6r\" (UID: \"2f5d2ff8-3f89-412b-98e7-44da88df0a7a\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-9ft6r" Apr 25 00:35:05.230861 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:05.230834 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-25g2b\" (UniqueName: \"kubernetes.io/projected/2f5d2ff8-3f89-412b-98e7-44da88df0a7a-kube-api-access-25g2b\") pod \"router-gateway-2-openshift-default-6866b85949-9ft6r\" (UID: \"2f5d2ff8-3f89-412b-98e7-44da88df0a7a\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-9ft6r" Apr 25 00:35:05.231229 ip-10-0-133-214 
kubenswrapper[2569]: I0425 00:35:05.231208 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/2f5d2ff8-3f89-412b-98e7-44da88df0a7a-istio-token\") pod \"router-gateway-2-openshift-default-6866b85949-9ft6r\" (UID: \"2f5d2ff8-3f89-412b-98e7-44da88df0a7a\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-9ft6r" Apr 25 00:35:05.385860 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:05.385778 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-9ft6r" Apr 25 00:35:05.719842 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:05.719813 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-9ft6r"] Apr 25 00:35:05.721479 ip-10-0-133-214 kubenswrapper[2569]: W0425 00:35:05.721452 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f5d2ff8_3f89_412b_98e7_44da88df0a7a.slice/crio-55b39fedfbc9eac37596b917ab7bfb859c15e6f0ab5d4c6c120ea7efb886c407 WatchSource:0}: Error finding container 55b39fedfbc9eac37596b917ab7bfb859c15e6f0ab5d4c6c120ea7efb886c407: Status 404 returned error can't find the container with id 55b39fedfbc9eac37596b917ab7bfb859c15e6f0ab5d4c6c120ea7efb886c407 Apr 25 00:35:06.315380 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:06.315331 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-9ft6r" event={"ID":"2f5d2ff8-3f89-412b-98e7-44da88df0a7a","Type":"ContainerStarted","Data":"55b39fedfbc9eac37596b917ab7bfb859c15e6f0ab5d4c6c120ea7efb886c407"} Apr 25 00:35:08.641158 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:08.641083 2569 kubelet_resources.go:45] "Allocatable" 
allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 25 00:35:08.641465 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:08.641216 2569 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 25 00:35:08.641465 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:08.641252 2569 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 25 00:35:09.332109 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:09.332068 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-9ft6r" event={"ID":"2f5d2ff8-3f89-412b-98e7-44da88df0a7a","Type":"ContainerStarted","Data":"04a5271e4716d8917a0a0e38423d4fb2be0234e8821fd9fe04aee02b577f7eaf"} Apr 25 00:35:09.355909 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:09.355860 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-9ft6r" podStartSLOduration=1.438792894 podStartE2EDuration="4.355845669s" podCreationTimestamp="2026-04-25 00:35:05 +0000 UTC" firstStartedPulling="2026-04-25 00:35:05.723760906 +0000 UTC m=+2484.328487482" lastFinishedPulling="2026-04-25 00:35:08.640813678 +0000 UTC m=+2487.245540257" observedRunningTime="2026-04-25 00:35:09.354180868 +0000 UTC m=+2487.958907485" watchObservedRunningTime="2026-04-25 00:35:09.355845669 +0000 UTC m=+2487.960572266" Apr 25 00:35:09.386227 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:09.386197 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-9ft6r" Apr 25 00:35:09.387323 ip-10-0-133-214 
kubenswrapper[2569]: I0425 00:35:09.387296 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-9ft6r" podUID="2f5d2ff8-3f89-412b-98e7-44da88df0a7a" containerName="istio-proxy" probeResult="failure" output="Get \"http://10.133.0.58:15021/healthz/ready\": dial tcp 10.133.0.58:15021: connect: connection refused" Apr 25 00:35:10.386262 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:10.386223 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-9ft6r" podUID="2f5d2ff8-3f89-412b-98e7-44da88df0a7a" containerName="istio-proxy" probeResult="failure" output="Get \"http://10.133.0.58:15021/healthz/ready\": dial tcp 10.133.0.58:15021: connect: connection refused" Apr 25 00:35:11.386345 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:11.386307 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-9ft6r" podUID="2f5d2ff8-3f89-412b-98e7-44da88df0a7a" containerName="istio-proxy" probeResult="failure" output="Get \"http://10.133.0.58:15021/healthz/ready\": dial tcp 10.133.0.58:15021: connect: connection refused" Apr 25 00:35:12.390516 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:12.390488 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-9ft6r" Apr 25 00:35:12.390885 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:12.390753 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-9ft6r" Apr 25 00:35:12.391381 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:12.391364 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-9ft6r" Apr 25 00:35:14.438704 ip-10-0-133-214 kubenswrapper[2569]: I0425 
00:35:14.438662 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7768b7748-7n9p4"] Apr 25 00:35:14.442776 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:14.442752 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7768b7748-7n9p4" Apr 25 00:35:14.445011 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:14.444990 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-kserve-dockercfg-9ht2q\"" Apr 25 00:35:14.445116 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:14.445035 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-kserve-self-signed-certs\"" Apr 25 00:35:14.451587 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:14.451566 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7768b7748-7n9p4"] Apr 25 00:35:14.506896 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:14.506864 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a2e018f2-61d4-416d-8518-267ad15b615f-dshm\") pod \"router-with-refs-pd-test-kserve-7768b7748-7n9p4\" (UID: \"a2e018f2-61d4-416d-8518-267ad15b615f\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7768b7748-7n9p4" Apr 25 00:35:14.506896 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:14.506897 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a2e018f2-61d4-416d-8518-267ad15b615f-model-cache\") pod \"router-with-refs-pd-test-kserve-7768b7748-7n9p4\" (UID: \"a2e018f2-61d4-416d-8518-267ad15b615f\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7768b7748-7n9p4" Apr 25 
00:35:14.507115 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:14.506919 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92jvq\" (UniqueName: \"kubernetes.io/projected/a2e018f2-61d4-416d-8518-267ad15b615f-kube-api-access-92jvq\") pod \"router-with-refs-pd-test-kserve-7768b7748-7n9p4\" (UID: \"a2e018f2-61d4-416d-8518-267ad15b615f\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7768b7748-7n9p4" Apr 25 00:35:14.507115 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:14.506969 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a2e018f2-61d4-416d-8518-267ad15b615f-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-7768b7748-7n9p4\" (UID: \"a2e018f2-61d4-416d-8518-267ad15b615f\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7768b7748-7n9p4" Apr 25 00:35:14.507115 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:14.507031 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a2e018f2-61d4-416d-8518-267ad15b615f-tmp-dir\") pod \"router-with-refs-pd-test-kserve-7768b7748-7n9p4\" (UID: \"a2e018f2-61d4-416d-8518-267ad15b615f\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7768b7748-7n9p4" Apr 25 00:35:14.507115 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:14.507064 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a2e018f2-61d4-416d-8518-267ad15b615f-tls-certs\") pod \"router-with-refs-pd-test-kserve-7768b7748-7n9p4\" (UID: \"a2e018f2-61d4-416d-8518-267ad15b615f\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7768b7748-7n9p4" Apr 25 00:35:14.507282 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:14.507154 2569 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a2e018f2-61d4-416d-8518-267ad15b615f-home\") pod \"router-with-refs-pd-test-kserve-7768b7748-7n9p4\" (UID: \"a2e018f2-61d4-416d-8518-267ad15b615f\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7768b7748-7n9p4" Apr 25 00:35:14.608306 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:14.608276 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a2e018f2-61d4-416d-8518-267ad15b615f-dshm\") pod \"router-with-refs-pd-test-kserve-7768b7748-7n9p4\" (UID: \"a2e018f2-61d4-416d-8518-267ad15b615f\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7768b7748-7n9p4" Apr 25 00:35:14.608306 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:14.608310 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a2e018f2-61d4-416d-8518-267ad15b615f-model-cache\") pod \"router-with-refs-pd-test-kserve-7768b7748-7n9p4\" (UID: \"a2e018f2-61d4-416d-8518-267ad15b615f\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7768b7748-7n9p4" Apr 25 00:35:14.608559 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:14.608331 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-92jvq\" (UniqueName: \"kubernetes.io/projected/a2e018f2-61d4-416d-8518-267ad15b615f-kube-api-access-92jvq\") pod \"router-with-refs-pd-test-kserve-7768b7748-7n9p4\" (UID: \"a2e018f2-61d4-416d-8518-267ad15b615f\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7768b7748-7n9p4" Apr 25 00:35:14.608559 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:14.608439 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/a2e018f2-61d4-416d-8518-267ad15b615f-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-7768b7748-7n9p4\" (UID: \"a2e018f2-61d4-416d-8518-267ad15b615f\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7768b7748-7n9p4" Apr 25 00:35:14.608559 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:14.608473 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a2e018f2-61d4-416d-8518-267ad15b615f-tmp-dir\") pod \"router-with-refs-pd-test-kserve-7768b7748-7n9p4\" (UID: \"a2e018f2-61d4-416d-8518-267ad15b615f\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7768b7748-7n9p4" Apr 25 00:35:14.608559 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:14.608497 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a2e018f2-61d4-416d-8518-267ad15b615f-tls-certs\") pod \"router-with-refs-pd-test-kserve-7768b7748-7n9p4\" (UID: \"a2e018f2-61d4-416d-8518-267ad15b615f\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7768b7748-7n9p4" Apr 25 00:35:14.608774 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:14.608594 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a2e018f2-61d4-416d-8518-267ad15b615f-home\") pod \"router-with-refs-pd-test-kserve-7768b7748-7n9p4\" (UID: \"a2e018f2-61d4-416d-8518-267ad15b615f\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7768b7748-7n9p4" Apr 25 00:35:14.608774 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:14.608725 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a2e018f2-61d4-416d-8518-267ad15b615f-model-cache\") pod \"router-with-refs-pd-test-kserve-7768b7748-7n9p4\" (UID: \"a2e018f2-61d4-416d-8518-267ad15b615f\") " 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7768b7748-7n9p4" Apr 25 00:35:14.608872 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:14.608850 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a2e018f2-61d4-416d-8518-267ad15b615f-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-7768b7748-7n9p4\" (UID: \"a2e018f2-61d4-416d-8518-267ad15b615f\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7768b7748-7n9p4" Apr 25 00:35:14.608917 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:14.608897 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a2e018f2-61d4-416d-8518-267ad15b615f-tmp-dir\") pod \"router-with-refs-pd-test-kserve-7768b7748-7n9p4\" (UID: \"a2e018f2-61d4-416d-8518-267ad15b615f\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7768b7748-7n9p4" Apr 25 00:35:14.608980 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:14.608963 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a2e018f2-61d4-416d-8518-267ad15b615f-home\") pod \"router-with-refs-pd-test-kserve-7768b7748-7n9p4\" (UID: \"a2e018f2-61d4-416d-8518-267ad15b615f\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7768b7748-7n9p4" Apr 25 00:35:14.610599 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:14.610579 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a2e018f2-61d4-416d-8518-267ad15b615f-dshm\") pod \"router-with-refs-pd-test-kserve-7768b7748-7n9p4\" (UID: \"a2e018f2-61d4-416d-8518-267ad15b615f\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7768b7748-7n9p4" Apr 25 00:35:14.610955 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:14.610935 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" 
(UniqueName: \"kubernetes.io/secret/a2e018f2-61d4-416d-8518-267ad15b615f-tls-certs\") pod \"router-with-refs-pd-test-kserve-7768b7748-7n9p4\" (UID: \"a2e018f2-61d4-416d-8518-267ad15b615f\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7768b7748-7n9p4" Apr 25 00:35:14.616539 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:14.616514 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-92jvq\" (UniqueName: \"kubernetes.io/projected/a2e018f2-61d4-416d-8518-267ad15b615f-kube-api-access-92jvq\") pod \"router-with-refs-pd-test-kserve-7768b7748-7n9p4\" (UID: \"a2e018f2-61d4-416d-8518-267ad15b615f\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7768b7748-7n9p4" Apr 25 00:35:14.754028 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:14.753999 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7768b7748-7n9p4" Apr 25 00:35:14.839587 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:14.839556 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-694999c9f5-txkqq"] Apr 25 00:35:14.845201 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:14.845176 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-694999c9f5-txkqq" Apr 25 00:35:14.856474 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:14.856438 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-694999c9f5-txkqq"] Apr 25 00:35:14.886133 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:14.886078 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7768b7748-7n9p4"] Apr 25 00:35:14.888463 ip-10-0-133-214 kubenswrapper[2569]: W0425 00:35:14.888435 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2e018f2_61d4_416d_8518_267ad15b615f.slice/crio-33795ad22e06ab8e3186bafaa94ceffe45cb5c84db40bc96ea856996b3c0d57f WatchSource:0}: Error finding container 33795ad22e06ab8e3186bafaa94ceffe45cb5c84db40bc96ea856996b3c0d57f: Status 404 returned error can't find the container with id 33795ad22e06ab8e3186bafaa94ceffe45cb5c84db40bc96ea856996b3c0d57f Apr 25 00:35:14.911559 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:14.911534 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d7411fc6-e656-4cf0-b884-5f4901fd379f-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-prefill-694999c9f5-txkqq\" (UID: \"d7411fc6-e656-4cf0-b884-5f4901fd379f\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-694999c9f5-txkqq" Apr 25 00:35:14.911694 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:14.911582 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d7411fc6-e656-4cf0-b884-5f4901fd379f-home\") pod \"router-with-refs-pd-test-kserve-prefill-694999c9f5-txkqq\" (UID: \"d7411fc6-e656-4cf0-b884-5f4901fd379f\") " 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-694999c9f5-txkqq" Apr 25 00:35:14.911694 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:14.911634 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d7411fc6-e656-4cf0-b884-5f4901fd379f-model-cache\") pod \"router-with-refs-pd-test-kserve-prefill-694999c9f5-txkqq\" (UID: \"d7411fc6-e656-4cf0-b884-5f4901fd379f\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-694999c9f5-txkqq" Apr 25 00:35:14.911694 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:14.911682 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spj2t\" (UniqueName: \"kubernetes.io/projected/d7411fc6-e656-4cf0-b884-5f4901fd379f-kube-api-access-spj2t\") pod \"router-with-refs-pd-test-kserve-prefill-694999c9f5-txkqq\" (UID: \"d7411fc6-e656-4cf0-b884-5f4901fd379f\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-694999c9f5-txkqq" Apr 25 00:35:14.911854 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:14.911711 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d7411fc6-e656-4cf0-b884-5f4901fd379f-tls-certs\") pod \"router-with-refs-pd-test-kserve-prefill-694999c9f5-txkqq\" (UID: \"d7411fc6-e656-4cf0-b884-5f4901fd379f\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-694999c9f5-txkqq" Apr 25 00:35:14.911854 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:14.911758 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d7411fc6-e656-4cf0-b884-5f4901fd379f-dshm\") pod \"router-with-refs-pd-test-kserve-prefill-694999c9f5-txkqq\" (UID: \"d7411fc6-e656-4cf0-b884-5f4901fd379f\") " 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-694999c9f5-txkqq" Apr 25 00:35:14.911854 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:14.911787 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d7411fc6-e656-4cf0-b884-5f4901fd379f-tmp-dir\") pod \"router-with-refs-pd-test-kserve-prefill-694999c9f5-txkqq\" (UID: \"d7411fc6-e656-4cf0-b884-5f4901fd379f\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-694999c9f5-txkqq" Apr 25 00:35:15.013182 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:15.013098 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d7411fc6-e656-4cf0-b884-5f4901fd379f-dshm\") pod \"router-with-refs-pd-test-kserve-prefill-694999c9f5-txkqq\" (UID: \"d7411fc6-e656-4cf0-b884-5f4901fd379f\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-694999c9f5-txkqq" Apr 25 00:35:15.013182 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:15.013142 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d7411fc6-e656-4cf0-b884-5f4901fd379f-tmp-dir\") pod \"router-with-refs-pd-test-kserve-prefill-694999c9f5-txkqq\" (UID: \"d7411fc6-e656-4cf0-b884-5f4901fd379f\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-694999c9f5-txkqq" Apr 25 00:35:15.013363 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:15.013252 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d7411fc6-e656-4cf0-b884-5f4901fd379f-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-prefill-694999c9f5-txkqq\" (UID: \"d7411fc6-e656-4cf0-b884-5f4901fd379f\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-694999c9f5-txkqq" Apr 25 00:35:15.013363 
ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:15.013294 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d7411fc6-e656-4cf0-b884-5f4901fd379f-home\") pod \"router-with-refs-pd-test-kserve-prefill-694999c9f5-txkqq\" (UID: \"d7411fc6-e656-4cf0-b884-5f4901fd379f\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-694999c9f5-txkqq" Apr 25 00:35:15.013363 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:15.013334 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d7411fc6-e656-4cf0-b884-5f4901fd379f-model-cache\") pod \"router-with-refs-pd-test-kserve-prefill-694999c9f5-txkqq\" (UID: \"d7411fc6-e656-4cf0-b884-5f4901fd379f\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-694999c9f5-txkqq" Apr 25 00:35:15.013566 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:15.013370 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-spj2t\" (UniqueName: \"kubernetes.io/projected/d7411fc6-e656-4cf0-b884-5f4901fd379f-kube-api-access-spj2t\") pod \"router-with-refs-pd-test-kserve-prefill-694999c9f5-txkqq\" (UID: \"d7411fc6-e656-4cf0-b884-5f4901fd379f\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-694999c9f5-txkqq" Apr 25 00:35:15.013566 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:15.013423 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d7411fc6-e656-4cf0-b884-5f4901fd379f-tls-certs\") pod \"router-with-refs-pd-test-kserve-prefill-694999c9f5-txkqq\" (UID: \"d7411fc6-e656-4cf0-b884-5f4901fd379f\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-694999c9f5-txkqq" Apr 25 00:35:15.013677 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:15.013631 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d7411fc6-e656-4cf0-b884-5f4901fd379f-tmp-dir\") pod \"router-with-refs-pd-test-kserve-prefill-694999c9f5-txkqq\" (UID: \"d7411fc6-e656-4cf0-b884-5f4901fd379f\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-694999c9f5-txkqq" Apr 25 00:35:15.013677 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:15.013663 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d7411fc6-e656-4cf0-b884-5f4901fd379f-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-prefill-694999c9f5-txkqq\" (UID: \"d7411fc6-e656-4cf0-b884-5f4901fd379f\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-694999c9f5-txkqq" Apr 25 00:35:15.013796 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:15.013765 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d7411fc6-e656-4cf0-b884-5f4901fd379f-model-cache\") pod \"router-with-refs-pd-test-kserve-prefill-694999c9f5-txkqq\" (UID: \"d7411fc6-e656-4cf0-b884-5f4901fd379f\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-694999c9f5-txkqq" Apr 25 00:35:15.013907 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:15.013844 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d7411fc6-e656-4cf0-b884-5f4901fd379f-home\") pod \"router-with-refs-pd-test-kserve-prefill-694999c9f5-txkqq\" (UID: \"d7411fc6-e656-4cf0-b884-5f4901fd379f\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-694999c9f5-txkqq" Apr 25 00:35:15.016046 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:15.016027 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d7411fc6-e656-4cf0-b884-5f4901fd379f-dshm\") pod \"router-with-refs-pd-test-kserve-prefill-694999c9f5-txkqq\" 
(UID: \"d7411fc6-e656-4cf0-b884-5f4901fd379f\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-694999c9f5-txkqq" Apr 25 00:35:15.016376 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:15.016357 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d7411fc6-e656-4cf0-b884-5f4901fd379f-tls-certs\") pod \"router-with-refs-pd-test-kserve-prefill-694999c9f5-txkqq\" (UID: \"d7411fc6-e656-4cf0-b884-5f4901fd379f\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-694999c9f5-txkqq" Apr 25 00:35:15.021134 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:15.021117 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-spj2t\" (UniqueName: \"kubernetes.io/projected/d7411fc6-e656-4cf0-b884-5f4901fd379f-kube-api-access-spj2t\") pod \"router-with-refs-pd-test-kserve-prefill-694999c9f5-txkqq\" (UID: \"d7411fc6-e656-4cf0-b884-5f4901fd379f\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-694999c9f5-txkqq" Apr 25 00:35:15.161771 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:15.161736 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-694999c9f5-txkqq" Apr 25 00:35:15.305864 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:15.305832 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-694999c9f5-txkqq"] Apr 25 00:35:15.307101 ip-10-0-133-214 kubenswrapper[2569]: W0425 00:35:15.307069 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7411fc6_e656_4cf0_b884_5f4901fd379f.slice/crio-ac1c3e8bac7460b51a04a8cd62ed773b7a930295664e523c475eca486efb77d7 WatchSource:0}: Error finding container ac1c3e8bac7460b51a04a8cd62ed773b7a930295664e523c475eca486efb77d7: Status 404 returned error can't find the container with id ac1c3e8bac7460b51a04a8cd62ed773b7a930295664e523c475eca486efb77d7 Apr 25 00:35:15.355475 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:15.355438 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7768b7748-7n9p4" event={"ID":"a2e018f2-61d4-416d-8518-267ad15b615f","Type":"ContainerStarted","Data":"33795ad22e06ab8e3186bafaa94ceffe45cb5c84db40bc96ea856996b3c0d57f"} Apr 25 00:35:15.356619 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:15.356591 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-694999c9f5-txkqq" event={"ID":"d7411fc6-e656-4cf0-b884-5f4901fd379f","Type":"ContainerStarted","Data":"ac1c3e8bac7460b51a04a8cd62ed773b7a930295664e523c475eca486efb77d7"} Apr 25 00:35:16.363607 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:16.363568 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-694999c9f5-txkqq" event={"ID":"d7411fc6-e656-4cf0-b884-5f4901fd379f","Type":"ContainerStarted","Data":"ec95f2ff0ae88858a69e3ca963960204b4e93016a3542d54affb3474398fa540"} Apr 25 
00:35:16.365064 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:16.365041 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7768b7748-7n9p4" event={"ID":"a2e018f2-61d4-416d-8518-267ad15b615f","Type":"ContainerStarted","Data":"cbc34650c7cdaaf8a2eae406c3de98806dc333c69a4fc47ace52d689148629f4"} Apr 25 00:35:16.365233 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:16.365212 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7768b7748-7n9p4" Apr 25 00:35:17.370823 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:17.370788 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7768b7748-7n9p4" event={"ID":"a2e018f2-61d4-416d-8518-267ad15b615f","Type":"ContainerStarted","Data":"6fdd0aa1201383381842e2bd7129cfcc10200bae8a5cb8d0106b37b78203739e"} Apr 25 00:35:21.390123 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:21.390088 2569 generic.go:358] "Generic (PLEG): container finished" podID="a2e018f2-61d4-416d-8518-267ad15b615f" containerID="6fdd0aa1201383381842e2bd7129cfcc10200bae8a5cb8d0106b37b78203739e" exitCode=0 Apr 25 00:35:21.390622 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:21.390164 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7768b7748-7n9p4" event={"ID":"a2e018f2-61d4-416d-8518-267ad15b615f","Type":"ContainerDied","Data":"6fdd0aa1201383381842e2bd7129cfcc10200bae8a5cb8d0106b37b78203739e"} Apr 25 00:35:22.397672 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:22.397634 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7768b7748-7n9p4" event={"ID":"a2e018f2-61d4-416d-8518-267ad15b615f","Type":"ContainerStarted","Data":"276182a6c0710195e6549148f9bb61686bc217394bff5281fcb18aded45dba9a"} Apr 25 00:35:22.419861 ip-10-0-133-214 
kubenswrapper[2569]: I0425 00:35:22.419804 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7768b7748-7n9p4" podStartSLOduration=7.568512132 podStartE2EDuration="8.419787944s" podCreationTimestamp="2026-04-25 00:35:14 +0000 UTC" firstStartedPulling="2026-04-25 00:35:14.89031614 +0000 UTC m=+2493.495042717" lastFinishedPulling="2026-04-25 00:35:15.741591953 +0000 UTC m=+2494.346318529" observedRunningTime="2026-04-25 00:35:22.41872457 +0000 UTC m=+2501.023451172" watchObservedRunningTime="2026-04-25 00:35:22.419787944 +0000 UTC m=+2501.024514543" Apr 25 00:35:24.754261 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:24.754211 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7768b7748-7n9p4" Apr 25 00:35:24.754261 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:24.754251 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7768b7748-7n9p4" Apr 25 00:35:24.755370 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:24.755339 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7768b7748-7n9p4" podUID="a2e018f2-61d4-416d-8518-267ad15b615f" containerName="main" probeResult="failure" output="Get \"https://10.133.0.59:8001/health\": dial tcp 10.133.0.59:8001: connect: connection refused" Apr 25 00:35:28.972706 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:28.972666 2569 kubelet_pods.go:1019] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" secret="" err="secret \"llmisvc-model-fb-opt-125m-route-f312f5-cb7fb8cf-dockercfg-bc7j6\" not found" Apr 25 00:35:29.151322 ip-10-0-133-214 kubenswrapper[2569]: E0425 00:35:29.151288 2569 secret.go:189] Couldn't get secret kserve-ci-e2e-test/llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs: secret "llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs" not found Apr 25 00:35:29.151526 ip-10-0-133-214 kubenswrapper[2569]: E0425 00:35:29.151371 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7-tls-certs podName:5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7 nodeName:}" failed. No retries permitted until 2026-04-25 00:35:29.651349239 +0000 UTC m=+2508.256075817 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7-tls-certs") pod "llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" (UID: "5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7") : secret "llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs" not found Apr 25 00:35:29.657415 ip-10-0-133-214 kubenswrapper[2569]: E0425 00:35:29.657366 2569 secret.go:189] Couldn't get secret kserve-ci-e2e-test/llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs: secret "llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs" not found Apr 25 00:35:29.657597 ip-10-0-133-214 kubenswrapper[2569]: E0425 00:35:29.657459 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7-tls-certs podName:5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7 nodeName:}" failed. No retries permitted until 2026-04-25 00:35:30.657443273 +0000 UTC m=+2509.262169849 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7-tls-certs") pod "llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" (UID: "5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7") : secret "llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs" not found Apr 25 00:35:30.667977 ip-10-0-133-214 kubenswrapper[2569]: E0425 00:35:30.667932 2569 secret.go:189] Couldn't get secret kserve-ci-e2e-test/llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs: secret "llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs" not found Apr 25 00:35:30.668489 ip-10-0-133-214 kubenswrapper[2569]: E0425 00:35:30.668001 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7-tls-certs podName:5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7 nodeName:}" failed. No retries permitted until 2026-04-25 00:35:32.667987053 +0000 UTC m=+2511.272713629 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7-tls-certs") pod "llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" (UID: "5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7") : secret "llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs" not found Apr 25 00:35:32.687979 ip-10-0-133-214 kubenswrapper[2569]: E0425 00:35:32.687899 2569 secret.go:189] Couldn't get secret kserve-ci-e2e-test/llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs: secret "llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs" not found Apr 25 00:35:32.688341 ip-10-0-133-214 kubenswrapper[2569]: E0425 00:35:32.687980 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7-tls-certs podName:5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7 nodeName:}" failed. 
No retries permitted until 2026-04-25 00:35:36.68795877 +0000 UTC m=+2515.292685347 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7-tls-certs") pod "llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" (UID: "5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7") : secret "llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs" not found Apr 25 00:35:34.754648 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:34.754605 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7768b7748-7n9p4" podUID="a2e018f2-61d4-416d-8518-267ad15b615f" containerName="main" probeResult="failure" output="Get \"https://10.133.0.59:8001/health\": dial tcp 10.133.0.59:8001: connect: connection refused" Apr 25 00:35:34.771609 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:34.771584 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7768b7748-7n9p4" Apr 25 00:35:36.724878 ip-10-0-133-214 kubenswrapper[2569]: E0425 00:35:36.724844 2569 secret.go:189] Couldn't get secret kserve-ci-e2e-test/llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs: secret "llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs" not found Apr 25 00:35:36.725279 ip-10-0-133-214 kubenswrapper[2569]: E0425 00:35:36.724929 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7-tls-certs podName:5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7 nodeName:}" failed. No retries permitted until 2026-04-25 00:35:44.724910773 +0000 UTC m=+2523.329637369 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7-tls-certs") pod "llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" (UID: "5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7") : secret "llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs" not found Apr 25 00:35:37.026370 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:37.026332 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 25 00:35:37.026639 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:37.026615 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" podUID="5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7" containerName="main" containerID="cri-o://bf31f46bfd356fddd714d62942cc2b97b2093c1053ae41fdb0d6d68d1825727f" gracePeriod=30 Apr 25 00:35:37.795813 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:37.795792 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 25 00:35:37.832783 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:37.832704 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7-model-cache\") pod \"5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7\" (UID: \"5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7\") " Apr 25 00:35:37.832783 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:37.832755 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7-tls-certs\") pod \"5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7\" (UID: \"5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7\") " Apr 25 00:35:37.833008 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:37.832790 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7-dshm\") pod \"5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7\" (UID: \"5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7\") " Apr 25 00:35:37.833008 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:37.832809 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7-home\") pod \"5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7\" (UID: \"5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7\") " Apr 25 00:35:37.833008 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:37.832840 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4mk8g\" (UniqueName: \"kubernetes.io/projected/5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7-kube-api-access-4mk8g\") pod \"5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7\" (UID: \"5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7\") " Apr 25 00:35:37.833178 ip-10-0-133-214 kubenswrapper[2569]: I0425 
00:35:37.833004 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7-model-cache" (OuterVolumeSpecName: "model-cache") pod "5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7" (UID: "5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 25 00:35:37.833178 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:37.833085 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7-kserve-provision-location\") pod \"5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7\" (UID: \"5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7\") " Apr 25 00:35:37.833178 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:37.833132 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7-tmp-dir\") pod \"5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7\" (UID: \"5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7\") " Apr 25 00:35:37.833507 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:37.833454 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7-home" (OuterVolumeSpecName: "home") pod "5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7" (UID: "5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 25 00:35:37.833507 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:37.833478 2569 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7-model-cache\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\"" Apr 25 00:35:37.835211 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:37.835183 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7-kube-api-access-4mk8g" (OuterVolumeSpecName: "kube-api-access-4mk8g") pod "5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7" (UID: "5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7"). InnerVolumeSpecName "kube-api-access-4mk8g". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 25 00:35:37.836123 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:37.836100 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7-dshm" (OuterVolumeSpecName: "dshm") pod "5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7" (UID: "5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 25 00:35:37.836225 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:37.836201 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7" (UID: "5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 25 00:35:37.846147 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:37.846122 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7" (UID: "5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 25 00:35:37.875180 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:37.875144 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7" (UID: "5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 25 00:35:37.934498 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:37.934473 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7-kserve-provision-location\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\"" Apr 25 00:35:37.934498 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:37.934500 2569 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7-tmp-dir\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\"" Apr 25 00:35:37.934679 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:37.934515 2569 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7-tls-certs\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\"" Apr 25 00:35:37.934679 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:37.934526 
2569 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7-dshm\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\"" Apr 25 00:35:37.934679 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:37.934539 2569 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7-home\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\"" Apr 25 00:35:37.934679 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:37.934551 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4mk8g\" (UniqueName: \"kubernetes.io/projected/5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7-kube-api-access-4mk8g\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\"" Apr 25 00:35:38.461858 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:38.461825 2569 generic.go:358] "Generic (PLEG): container finished" podID="5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7" containerID="bf31f46bfd356fddd714d62942cc2b97b2093c1053ae41fdb0d6d68d1825727f" exitCode=0 Apr 25 00:35:38.462046 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:38.461900 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 25 00:35:38.462046 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:38.461914 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7","Type":"ContainerDied","Data":"bf31f46bfd356fddd714d62942cc2b97b2093c1053ae41fdb0d6d68d1825727f"} Apr 25 00:35:38.462046 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:38.461961 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7","Type":"ContainerDied","Data":"5b460758593f349a7646f6cff7ab71f8a5c14850aea27082b2481d10696b403c"} Apr 25 00:35:38.462046 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:38.461984 2569 scope.go:117] "RemoveContainer" containerID="bf31f46bfd356fddd714d62942cc2b97b2093c1053ae41fdb0d6d68d1825727f" Apr 25 00:35:38.471379 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:38.471361 2569 scope.go:117] "RemoveContainer" containerID="b8c0295472295d4524c16174fc4fc826dc4a2ae6b8ad2575c8407f19d31b0f94" Apr 25 00:35:38.482114 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:38.482087 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 25 00:35:38.489307 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:38.489279 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 25 00:35:38.518546 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:38.518524 2569 scope.go:117] "RemoveContainer" containerID="bf31f46bfd356fddd714d62942cc2b97b2093c1053ae41fdb0d6d68d1825727f" Apr 25 00:35:38.518840 ip-10-0-133-214 kubenswrapper[2569]: E0425 00:35:38.518820 2569 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"bf31f46bfd356fddd714d62942cc2b97b2093c1053ae41fdb0d6d68d1825727f\": container with ID starting with bf31f46bfd356fddd714d62942cc2b97b2093c1053ae41fdb0d6d68d1825727f not found: ID does not exist" containerID="bf31f46bfd356fddd714d62942cc2b97b2093c1053ae41fdb0d6d68d1825727f" Apr 25 00:35:38.518923 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:38.518851 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf31f46bfd356fddd714d62942cc2b97b2093c1053ae41fdb0d6d68d1825727f"} err="failed to get container status \"bf31f46bfd356fddd714d62942cc2b97b2093c1053ae41fdb0d6d68d1825727f\": rpc error: code = NotFound desc = could not find container \"bf31f46bfd356fddd714d62942cc2b97b2093c1053ae41fdb0d6d68d1825727f\": container with ID starting with bf31f46bfd356fddd714d62942cc2b97b2093c1053ae41fdb0d6d68d1825727f not found: ID does not exist" Apr 25 00:35:38.518923 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:38.518871 2569 scope.go:117] "RemoveContainer" containerID="b8c0295472295d4524c16174fc4fc826dc4a2ae6b8ad2575c8407f19d31b0f94" Apr 25 00:35:38.519151 ip-10-0-133-214 kubenswrapper[2569]: E0425 00:35:38.519125 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8c0295472295d4524c16174fc4fc826dc4a2ae6b8ad2575c8407f19d31b0f94\": container with ID starting with b8c0295472295d4524c16174fc4fc826dc4a2ae6b8ad2575c8407f19d31b0f94 not found: ID does not exist" containerID="b8c0295472295d4524c16174fc4fc826dc4a2ae6b8ad2575c8407f19d31b0f94" Apr 25 00:35:38.519196 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:38.519166 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8c0295472295d4524c16174fc4fc826dc4a2ae6b8ad2575c8407f19d31b0f94"} err="failed to get container status 
\"b8c0295472295d4524c16174fc4fc826dc4a2ae6b8ad2575c8407f19d31b0f94\": rpc error: code = NotFound desc = could not find container \"b8c0295472295d4524c16174fc4fc826dc4a2ae6b8ad2575c8407f19d31b0f94\": container with ID starting with b8c0295472295d4524c16174fc4fc826dc4a2ae6b8ad2575c8407f19d31b0f94 not found: ID does not exist" Apr 25 00:35:39.979342 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:39.979306 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7" path="/var/lib/kubelet/pods/5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7/volumes" Apr 25 00:35:44.754591 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:44.754532 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7768b7748-7n9p4" podUID="a2e018f2-61d4-416d-8518-267ad15b615f" containerName="main" probeResult="failure" output="Get \"https://10.133.0.59:8001/health\": dial tcp 10.133.0.59:8001: connect: connection refused" Apr 25 00:35:54.754965 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:35:54.754914 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7768b7748-7n9p4" podUID="a2e018f2-61d4-416d-8518-267ad15b615f" containerName="main" probeResult="failure" output="Get \"https://10.133.0.59:8001/health\": dial tcp 10.133.0.59:8001: connect: connection refused" Apr 25 00:36:04.755210 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:36:04.755155 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7768b7748-7n9p4" podUID="a2e018f2-61d4-416d-8518-267ad15b615f" containerName="main" probeResult="failure" output="Get \"https://10.133.0.59:8001/health\": dial tcp 10.133.0.59:8001: connect: connection refused" Apr 25 00:36:14.755364 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:36:14.755313 2569 prober.go:120] "Probe failed" probeType="Startup" 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7768b7748-7n9p4" podUID="a2e018f2-61d4-416d-8518-267ad15b615f" containerName="main" probeResult="failure" output="Get \"https://10.133.0.59:8001/health\": dial tcp 10.133.0.59:8001: connect: connection refused" Apr 25 00:36:24.754484 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:36:24.754437 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7768b7748-7n9p4" podUID="a2e018f2-61d4-416d-8518-267ad15b615f" containerName="main" probeResult="failure" output="Get \"https://10.133.0.59:8001/health\": dial tcp 10.133.0.59:8001: connect: connection refused" Apr 25 00:36:34.755174 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:36:34.755122 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7768b7748-7n9p4" podUID="a2e018f2-61d4-416d-8518-267ad15b615f" containerName="main" probeResult="failure" output="Get \"https://10.133.0.59:8001/health\": dial tcp 10.133.0.59:8001: connect: connection refused" Apr 25 00:36:44.754972 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:36:44.754924 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7768b7748-7n9p4" podUID="a2e018f2-61d4-416d-8518-267ad15b615f" containerName="main" probeResult="failure" output="Get \"https://10.133.0.59:8001/health\": dial tcp 10.133.0.59:8001: connect: connection refused" Apr 25 00:36:54.769889 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:36:54.769859 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7768b7748-7n9p4" Apr 25 00:36:54.781871 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:36:54.781848 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7768b7748-7n9p4" Apr 25 00:50:09.189591 ip-10-0-133-214 kubenswrapper[2569]: 
I0425 00:50:09.189533 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7768b7748-7n9p4"] Apr 25 00:50:09.192375 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:09.190606 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7768b7748-7n9p4" podUID="a2e018f2-61d4-416d-8518-267ad15b615f" containerName="main" containerID="cri-o://276182a6c0710195e6549148f9bb61686bc217394bff5281fcb18aded45dba9a" gracePeriod=30 Apr 25 00:50:09.192974 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:09.192952 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-694999c9f5-txkqq"] Apr 25 00:50:09.193242 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:09.193200 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-694999c9f5-txkqq" podUID="d7411fc6-e656-4cf0-b884-5f4901fd379f" containerName="storage-initializer" containerID="cri-o://ec95f2ff0ae88858a69e3ca963960204b4e93016a3542d54affb3474398fa540" gracePeriod=30 Apr 25 00:50:24.749372 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:24.749341 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-9ft6r_2f5d2ff8-3f89-412b-98e7-44da88df0a7a/istio-proxy/0.log" Apr 25 00:50:24.782861 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:24.782834 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7768b7748-7n9p4_a2e018f2-61d4-416d-8518-267ad15b615f/main/0.log" Apr 25 00:50:24.792909 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:24.792885 2569 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7768b7748-7n9p4_a2e018f2-61d4-416d-8518-267ad15b615f/llm-d-routing-sidecar/0.log" Apr 25 00:50:24.799395 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:24.799374 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7768b7748-7n9p4_a2e018f2-61d4-416d-8518-267ad15b615f/storage-initializer/0.log" Apr 25 00:50:24.822109 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:24.822072 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-694999c9f5-txkqq_d7411fc6-e656-4cf0-b884-5f4901fd379f/storage-initializer/0.log" Apr 25 00:50:24.890269 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:24.890245 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-55694477d7-wnrs2_b8941e9d-c1ba-4e54-a62b-492705d58230/main/0.log" Apr 25 00:50:24.907940 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:24.907918 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-55694477d7-wnrs2_b8941e9d-c1ba-4e54-a62b-492705d58230/tokenizer/0.log" Apr 25 00:50:24.915921 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:24.915901 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-55694477d7-wnrs2_b8941e9d-c1ba-4e54-a62b-492705d58230/storage-initializer/0.log" Apr 25 00:50:25.896223 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:25.896193 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-9ft6r_2f5d2ff8-3f89-412b-98e7-44da88df0a7a/istio-proxy/0.log" Apr 25 00:50:25.918573 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:25.918546 2569 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7768b7748-7n9p4_a2e018f2-61d4-416d-8518-267ad15b615f/main/0.log" Apr 25 00:50:25.928727 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:25.928706 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7768b7748-7n9p4_a2e018f2-61d4-416d-8518-267ad15b615f/llm-d-routing-sidecar/0.log" Apr 25 00:50:25.935496 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:25.935472 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7768b7748-7n9p4_a2e018f2-61d4-416d-8518-267ad15b615f/storage-initializer/0.log" Apr 25 00:50:25.960034 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:25.960013 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-694999c9f5-txkqq_d7411fc6-e656-4cf0-b884-5f4901fd379f/storage-initializer/0.log" Apr 25 00:50:26.002704 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:26.002679 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-55694477d7-wnrs2_b8941e9d-c1ba-4e54-a62b-492705d58230/main/0.log" Apr 25 00:50:26.020356 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:26.020331 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-55694477d7-wnrs2_b8941e9d-c1ba-4e54-a62b-492705d58230/tokenizer/0.log" Apr 25 00:50:26.026904 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:26.026881 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-55694477d7-wnrs2_b8941e9d-c1ba-4e54-a62b-492705d58230/storage-initializer/0.log" Apr 25 00:50:27.001278 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:27.001249 2569 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-9ft6r_2f5d2ff8-3f89-412b-98e7-44da88df0a7a/istio-proxy/0.log" Apr 25 00:50:27.022744 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:27.022710 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7768b7748-7n9p4_a2e018f2-61d4-416d-8518-267ad15b615f/main/0.log" Apr 25 00:50:27.032519 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:27.032490 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7768b7748-7n9p4_a2e018f2-61d4-416d-8518-267ad15b615f/llm-d-routing-sidecar/0.log" Apr 25 00:50:27.039201 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:27.039183 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7768b7748-7n9p4_a2e018f2-61d4-416d-8518-267ad15b615f/storage-initializer/0.log" Apr 25 00:50:27.063542 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:27.063519 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-694999c9f5-txkqq_d7411fc6-e656-4cf0-b884-5f4901fd379f/storage-initializer/0.log" Apr 25 00:50:27.105205 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:27.105183 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-55694477d7-wnrs2_b8941e9d-c1ba-4e54-a62b-492705d58230/main/0.log" Apr 25 00:50:27.123356 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:27.123328 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-55694477d7-wnrs2_b8941e9d-c1ba-4e54-a62b-492705d58230/tokenizer/0.log" Apr 25 00:50:27.130124 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:27.130106 2569 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-55694477d7-wnrs2_b8941e9d-c1ba-4e54-a62b-492705d58230/storage-initializer/0.log" Apr 25 00:50:28.078464 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:28.078437 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-9ft6r_2f5d2ff8-3f89-412b-98e7-44da88df0a7a/istio-proxy/0.log" Apr 25 00:50:28.099277 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:28.099246 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7768b7748-7n9p4_a2e018f2-61d4-416d-8518-267ad15b615f/main/0.log" Apr 25 00:50:28.109374 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:28.109350 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7768b7748-7n9p4_a2e018f2-61d4-416d-8518-267ad15b615f/llm-d-routing-sidecar/0.log" Apr 25 00:50:28.115483 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:28.115464 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7768b7748-7n9p4_a2e018f2-61d4-416d-8518-267ad15b615f/storage-initializer/0.log" Apr 25 00:50:28.139529 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:28.139508 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-694999c9f5-txkqq_d7411fc6-e656-4cf0-b884-5f4901fd379f/storage-initializer/0.log" Apr 25 00:50:28.180115 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:28.180090 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-55694477d7-wnrs2_b8941e9d-c1ba-4e54-a62b-492705d58230/main/0.log" Apr 25 00:50:28.198287 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:28.198264 2569 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-55694477d7-wnrs2_b8941e9d-c1ba-4e54-a62b-492705d58230/tokenizer/0.log" Apr 25 00:50:28.206006 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:28.205988 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-55694477d7-wnrs2_b8941e9d-c1ba-4e54-a62b-492705d58230/storage-initializer/0.log" Apr 25 00:50:29.178365 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:29.178333 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-9ft6r_2f5d2ff8-3f89-412b-98e7-44da88df0a7a/istio-proxy/0.log" Apr 25 00:50:29.198914 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:29.198890 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7768b7748-7n9p4_a2e018f2-61d4-416d-8518-267ad15b615f/main/0.log" Apr 25 00:50:29.207857 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:29.207832 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7768b7748-7n9p4_a2e018f2-61d4-416d-8518-267ad15b615f/llm-d-routing-sidecar/0.log" Apr 25 00:50:29.213480 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:29.213461 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7768b7748-7n9p4_a2e018f2-61d4-416d-8518-267ad15b615f/storage-initializer/0.log" Apr 25 00:50:29.238409 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:29.238382 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-694999c9f5-txkqq_d7411fc6-e656-4cf0-b884-5f4901fd379f/storage-initializer/0.log" Apr 25 00:50:29.276291 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:29.276267 2569 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-55694477d7-wnrs2_b8941e9d-c1ba-4e54-a62b-492705d58230/main/0.log" Apr 25 00:50:29.294168 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:29.294146 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-55694477d7-wnrs2_b8941e9d-c1ba-4e54-a62b-492705d58230/tokenizer/0.log" Apr 25 00:50:29.301958 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:29.301924 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-55694477d7-wnrs2_b8941e9d-c1ba-4e54-a62b-492705d58230/storage-initializer/0.log" Apr 25 00:50:30.256117 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:30.256083 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-9ft6r_2f5d2ff8-3f89-412b-98e7-44da88df0a7a/istio-proxy/0.log" Apr 25 00:50:30.277445 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:30.277396 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7768b7748-7n9p4_a2e018f2-61d4-416d-8518-267ad15b615f/main/0.log" Apr 25 00:50:30.288038 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:30.288012 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7768b7748-7n9p4_a2e018f2-61d4-416d-8518-267ad15b615f/llm-d-routing-sidecar/0.log" Apr 25 00:50:30.293355 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:30.293332 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7768b7748-7n9p4_a2e018f2-61d4-416d-8518-267ad15b615f/storage-initializer/0.log" Apr 25 00:50:30.317497 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:30.317478 2569 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-694999c9f5-txkqq_d7411fc6-e656-4cf0-b884-5f4901fd379f/storage-initializer/0.log" Apr 25 00:50:30.353081 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:30.353063 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-55694477d7-wnrs2_b8941e9d-c1ba-4e54-a62b-492705d58230/main/0.log" Apr 25 00:50:30.370254 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:30.370234 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-55694477d7-wnrs2_b8941e9d-c1ba-4e54-a62b-492705d58230/tokenizer/0.log" Apr 25 00:50:30.377950 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:30.377927 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-55694477d7-wnrs2_b8941e9d-c1ba-4e54-a62b-492705d58230/storage-initializer/0.log" Apr 25 00:50:31.338616 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:31.338531 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-9ft6r_2f5d2ff8-3f89-412b-98e7-44da88df0a7a/istio-proxy/0.log" Apr 25 00:50:31.359126 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:31.359104 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7768b7748-7n9p4_a2e018f2-61d4-416d-8518-267ad15b615f/main/0.log" Apr 25 00:50:31.376301 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:31.376280 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7768b7748-7n9p4_a2e018f2-61d4-416d-8518-267ad15b615f/llm-d-routing-sidecar/0.log" Apr 25 00:50:31.382969 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:31.382937 2569 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7768b7748-7n9p4_a2e018f2-61d4-416d-8518-267ad15b615f/storage-initializer/0.log" Apr 25 00:50:31.405330 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:31.405309 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-694999c9f5-txkqq_d7411fc6-e656-4cf0-b884-5f4901fd379f/storage-initializer/0.log" Apr 25 00:50:31.451818 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:31.451787 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-55694477d7-wnrs2_b8941e9d-c1ba-4e54-a62b-492705d58230/main/0.log" Apr 25 00:50:31.469021 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:31.468997 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-55694477d7-wnrs2_b8941e9d-c1ba-4e54-a62b-492705d58230/tokenizer/0.log" Apr 25 00:50:31.474272 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:31.474256 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-55694477d7-wnrs2_b8941e9d-c1ba-4e54-a62b-492705d58230/storage-initializer/0.log" Apr 25 00:50:32.429471 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:32.429436 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-9ft6r_2f5d2ff8-3f89-412b-98e7-44da88df0a7a/istio-proxy/0.log" Apr 25 00:50:32.449437 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:32.449414 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7768b7748-7n9p4_a2e018f2-61d4-416d-8518-267ad15b615f/main/0.log" Apr 25 00:50:32.459313 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:32.459290 2569 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7768b7748-7n9p4_a2e018f2-61d4-416d-8518-267ad15b615f/llm-d-routing-sidecar/0.log" Apr 25 00:50:32.465859 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:32.465833 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7768b7748-7n9p4_a2e018f2-61d4-416d-8518-267ad15b615f/storage-initializer/0.log" Apr 25 00:50:32.516939 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:32.516919 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-694999c9f5-txkqq_d7411fc6-e656-4cf0-b884-5f4901fd379f/storage-initializer/0.log" Apr 25 00:50:32.555753 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:32.555723 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-55694477d7-wnrs2_b8941e9d-c1ba-4e54-a62b-492705d58230/main/0.log" Apr 25 00:50:32.573815 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:32.573791 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-55694477d7-wnrs2_b8941e9d-c1ba-4e54-a62b-492705d58230/tokenizer/0.log" Apr 25 00:50:32.581306 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:32.581285 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-55694477d7-wnrs2_b8941e9d-c1ba-4e54-a62b-492705d58230/storage-initializer/0.log" Apr 25 00:50:33.536037 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:33.535985 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-9ft6r_2f5d2ff8-3f89-412b-98e7-44da88df0a7a/istio-proxy/0.log" Apr 25 00:50:33.557910 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:33.557865 2569 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7768b7748-7n9p4_a2e018f2-61d4-416d-8518-267ad15b615f/main/0.log" Apr 25 00:50:33.567429 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:33.567391 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7768b7748-7n9p4_a2e018f2-61d4-416d-8518-267ad15b615f/llm-d-routing-sidecar/0.log" Apr 25 00:50:33.573355 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:33.573334 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7768b7748-7n9p4_a2e018f2-61d4-416d-8518-267ad15b615f/storage-initializer/0.log" Apr 25 00:50:33.597556 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:33.597534 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-694999c9f5-txkqq_d7411fc6-e656-4cf0-b884-5f4901fd379f/storage-initializer/0.log" Apr 25 00:50:33.637393 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:33.637369 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-55694477d7-wnrs2_b8941e9d-c1ba-4e54-a62b-492705d58230/main/0.log" Apr 25 00:50:33.655557 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:33.655535 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-55694477d7-wnrs2_b8941e9d-c1ba-4e54-a62b-492705d58230/tokenizer/0.log" Apr 25 00:50:33.662542 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:33.662524 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-55694477d7-wnrs2_b8941e9d-c1ba-4e54-a62b-492705d58230/storage-initializer/0.log" Apr 25 00:50:34.638691 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:34.638662 2569 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-9ft6r_2f5d2ff8-3f89-412b-98e7-44da88df0a7a/istio-proxy/0.log" Apr 25 00:50:34.657202 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:34.657171 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7768b7748-7n9p4_a2e018f2-61d4-416d-8518-267ad15b615f/main/0.log" Apr 25 00:50:34.672357 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:34.672334 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7768b7748-7n9p4_a2e018f2-61d4-416d-8518-267ad15b615f/llm-d-routing-sidecar/0.log" Apr 25 00:50:34.678733 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:34.678710 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7768b7748-7n9p4_a2e018f2-61d4-416d-8518-267ad15b615f/storage-initializer/0.log" Apr 25 00:50:34.701874 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:34.701846 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-694999c9f5-txkqq_d7411fc6-e656-4cf0-b884-5f4901fd379f/storage-initializer/0.log" Apr 25 00:50:34.744070 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:34.744047 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-55694477d7-wnrs2_b8941e9d-c1ba-4e54-a62b-492705d58230/main/0.log" Apr 25 00:50:34.763296 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:34.763262 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-55694477d7-wnrs2_b8941e9d-c1ba-4e54-a62b-492705d58230/tokenizer/0.log" Apr 25 00:50:34.769181 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:34.769159 2569 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-55694477d7-wnrs2_b8941e9d-c1ba-4e54-a62b-492705d58230/storage-initializer/0.log" Apr 25 00:50:35.754850 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:35.754825 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-9ft6r_2f5d2ff8-3f89-412b-98e7-44da88df0a7a/istio-proxy/0.log" Apr 25 00:50:35.777622 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:35.777590 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7768b7748-7n9p4_a2e018f2-61d4-416d-8518-267ad15b615f/main/0.log" Apr 25 00:50:35.787394 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:35.787364 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7768b7748-7n9p4_a2e018f2-61d4-416d-8518-267ad15b615f/llm-d-routing-sidecar/0.log" Apr 25 00:50:35.794350 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:35.794325 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7768b7748-7n9p4_a2e018f2-61d4-416d-8518-267ad15b615f/storage-initializer/0.log" Apr 25 00:50:35.819973 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:35.819941 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-694999c9f5-txkqq_d7411fc6-e656-4cf0-b884-5f4901fd379f/storage-initializer/0.log" Apr 25 00:50:35.859472 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:35.859451 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-55694477d7-wnrs2_b8941e9d-c1ba-4e54-a62b-492705d58230/main/0.log" Apr 25 00:50:35.877245 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:35.877223 2569 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-55694477d7-wnrs2_b8941e9d-c1ba-4e54-a62b-492705d58230/tokenizer/0.log" Apr 25 00:50:35.886104 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:35.886086 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-55694477d7-wnrs2_b8941e9d-c1ba-4e54-a62b-492705d58230/storage-initializer/0.log" Apr 25 00:50:36.912462 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:36.912432 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-9ft6r_2f5d2ff8-3f89-412b-98e7-44da88df0a7a/istio-proxy/0.log" Apr 25 00:50:36.945494 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:36.945467 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7768b7748-7n9p4_a2e018f2-61d4-416d-8518-267ad15b615f/main/0.log" Apr 25 00:50:36.955271 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:36.955246 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7768b7748-7n9p4_a2e018f2-61d4-416d-8518-267ad15b615f/llm-d-routing-sidecar/0.log" Apr 25 00:50:36.961662 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:36.961629 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7768b7748-7n9p4_a2e018f2-61d4-416d-8518-267ad15b615f/storage-initializer/0.log" Apr 25 00:50:36.984566 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:36.984546 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-694999c9f5-txkqq_d7411fc6-e656-4cf0-b884-5f4901fd379f/storage-initializer/0.log" Apr 25 00:50:37.024129 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:37.024104 2569 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-55694477d7-wnrs2_b8941e9d-c1ba-4e54-a62b-492705d58230/main/0.log" Apr 25 00:50:37.042063 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:37.042039 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-55694477d7-wnrs2_b8941e9d-c1ba-4e54-a62b-492705d58230/tokenizer/0.log" Apr 25 00:50:37.047565 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:37.047540 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-55694477d7-wnrs2_b8941e9d-c1ba-4e54-a62b-492705d58230/storage-initializer/0.log" Apr 25 00:50:38.018049 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:38.017997 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-9ft6r_2f5d2ff8-3f89-412b-98e7-44da88df0a7a/istio-proxy/0.log" Apr 25 00:50:38.038277 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:38.038248 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7768b7748-7n9p4_a2e018f2-61d4-416d-8518-267ad15b615f/main/0.log" Apr 25 00:50:38.047789 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:38.047759 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7768b7748-7n9p4_a2e018f2-61d4-416d-8518-267ad15b615f/llm-d-routing-sidecar/0.log" Apr 25 00:50:38.054280 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:38.054129 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7768b7748-7n9p4_a2e018f2-61d4-416d-8518-267ad15b615f/storage-initializer/0.log" Apr 25 00:50:38.077584 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:38.077558 2569 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-694999c9f5-txkqq_d7411fc6-e656-4cf0-b884-5f4901fd379f/storage-initializer/0.log" Apr 25 00:50:38.117448 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:38.117427 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-55694477d7-wnrs2_b8941e9d-c1ba-4e54-a62b-492705d58230/main/0.log" Apr 25 00:50:38.135486 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:38.135456 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-55694477d7-wnrs2_b8941e9d-c1ba-4e54-a62b-492705d58230/tokenizer/0.log" Apr 25 00:50:38.141717 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:38.141699 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-55694477d7-wnrs2_b8941e9d-c1ba-4e54-a62b-492705d58230/storage-initializer/0.log" Apr 25 00:50:39.111342 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:39.111313 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-9ft6r_2f5d2ff8-3f89-412b-98e7-44da88df0a7a/istio-proxy/0.log" Apr 25 00:50:39.158914 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:39.158885 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7768b7748-7n9p4_a2e018f2-61d4-416d-8518-267ad15b615f/main/0.log" Apr 25 00:50:39.191505 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:39.191449 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7768b7748-7n9p4" podUID="a2e018f2-61d4-416d-8518-267ad15b615f" containerName="llm-d-routing-sidecar" containerID="cri-o://cbc34650c7cdaaf8a2eae406c3de98806dc333c69a4fc47ace52d689148629f4" gracePeriod=2 Apr 25 00:50:39.221196 ip-10-0-133-214 
kubenswrapper[2569]: I0425 00:50:39.221167 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7768b7748-7n9p4_a2e018f2-61d4-416d-8518-267ad15b615f/llm-d-routing-sidecar/0.log" Apr 25 00:50:39.245890 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:39.245540 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7768b7748-7n9p4_a2e018f2-61d4-416d-8518-267ad15b615f/storage-initializer/0.log" Apr 25 00:50:39.269251 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:39.269229 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-694999c9f5-txkqq_d7411fc6-e656-4cf0-b884-5f4901fd379f/storage-initializer/0.log" Apr 25 00:50:39.309331 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:39.309305 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-55694477d7-wnrs2_b8941e9d-c1ba-4e54-a62b-492705d58230/main/0.log" Apr 25 00:50:39.329476 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:39.329366 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-55694477d7-wnrs2_b8941e9d-c1ba-4e54-a62b-492705d58230/tokenizer/0.log" Apr 25 00:50:39.335540 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:39.335342 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-55694477d7-wnrs2_b8941e9d-c1ba-4e54-a62b-492705d58230/storage-initializer/0.log" Apr 25 00:50:39.456299 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:39.456273 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-694999c9f5-txkqq_d7411fc6-e656-4cf0-b884-5f4901fd379f/storage-initializer/0.log" Apr 25 00:50:39.456446 ip-10-0-133-214 kubenswrapper[2569]: I0425 
00:50:39.456344 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-694999c9f5-txkqq" Apr 25 00:50:39.472537 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:39.472520 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7768b7748-7n9p4_a2e018f2-61d4-416d-8518-267ad15b615f/main/0.log" Apr 25 00:50:39.473119 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:39.473103 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7768b7748-7n9p4" Apr 25 00:50:39.505193 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:39.505168 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d7411fc6-e656-4cf0-b884-5f4901fd379f-home\") pod \"d7411fc6-e656-4cf0-b884-5f4901fd379f\" (UID: \"d7411fc6-e656-4cf0-b884-5f4901fd379f\") " Apr 25 00:50:39.505334 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:39.505243 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a2e018f2-61d4-416d-8518-267ad15b615f-dshm\") pod \"a2e018f2-61d4-416d-8518-267ad15b615f\" (UID: \"a2e018f2-61d4-416d-8518-267ad15b615f\") " Apr 25 00:50:39.505334 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:39.505272 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-spj2t\" (UniqueName: \"kubernetes.io/projected/d7411fc6-e656-4cf0-b884-5f4901fd379f-kube-api-access-spj2t\") pod \"d7411fc6-e656-4cf0-b884-5f4901fd379f\" (UID: \"d7411fc6-e656-4cf0-b884-5f4901fd379f\") " Apr 25 00:50:39.505334 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:39.505299 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: 
\"kubernetes.io/empty-dir/a2e018f2-61d4-416d-8518-267ad15b615f-home\") pod \"a2e018f2-61d4-416d-8518-267ad15b615f\" (UID: \"a2e018f2-61d4-416d-8518-267ad15b615f\") "
Apr 25 00:50:39.505334 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:39.505322 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d7411fc6-e656-4cf0-b884-5f4901fd379f-kserve-provision-location\") pod \"d7411fc6-e656-4cf0-b884-5f4901fd379f\" (UID: \"d7411fc6-e656-4cf0-b884-5f4901fd379f\") "
Apr 25 00:50:39.505565 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:39.505355 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d7411fc6-e656-4cf0-b884-5f4901fd379f-tls-certs\") pod \"d7411fc6-e656-4cf0-b884-5f4901fd379f\" (UID: \"d7411fc6-e656-4cf0-b884-5f4901fd379f\") "
Apr 25 00:50:39.505565 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:39.505392 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a2e018f2-61d4-416d-8518-267ad15b615f-model-cache\") pod \"a2e018f2-61d4-416d-8518-267ad15b615f\" (UID: \"a2e018f2-61d4-416d-8518-267ad15b615f\") "
Apr 25 00:50:39.505565 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:39.505445 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a2e018f2-61d4-416d-8518-267ad15b615f-tls-certs\") pod \"a2e018f2-61d4-416d-8518-267ad15b615f\" (UID: \"a2e018f2-61d4-416d-8518-267ad15b615f\") "
Apr 25 00:50:39.505565 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:39.505466 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d7411fc6-e656-4cf0-b884-5f4901fd379f-dshm\") pod \"d7411fc6-e656-4cf0-b884-5f4901fd379f\" (UID: \"d7411fc6-e656-4cf0-b884-5f4901fd379f\") "
Apr 25 00:50:39.505565 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:39.505467 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7411fc6-e656-4cf0-b884-5f4901fd379f-home" (OuterVolumeSpecName: "home") pod "d7411fc6-e656-4cf0-b884-5f4901fd379f" (UID: "d7411fc6-e656-4cf0-b884-5f4901fd379f"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 25 00:50:39.505565 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:39.505502 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a2e018f2-61d4-416d-8518-267ad15b615f-tmp-dir\") pod \"a2e018f2-61d4-416d-8518-267ad15b615f\" (UID: \"a2e018f2-61d4-416d-8518-267ad15b615f\") "
Apr 25 00:50:39.505565 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:39.505565 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d7411fc6-e656-4cf0-b884-5f4901fd379f-model-cache\") pod \"d7411fc6-e656-4cf0-b884-5f4901fd379f\" (UID: \"d7411fc6-e656-4cf0-b884-5f4901fd379f\") "
Apr 25 00:50:39.505961 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:39.505613 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-92jvq\" (UniqueName: \"kubernetes.io/projected/a2e018f2-61d4-416d-8518-267ad15b615f-kube-api-access-92jvq\") pod \"a2e018f2-61d4-416d-8518-267ad15b615f\" (UID: \"a2e018f2-61d4-416d-8518-267ad15b615f\") "
Apr 25 00:50:39.505961 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:39.505666 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a2e018f2-61d4-416d-8518-267ad15b615f-kserve-provision-location\") pod \"a2e018f2-61d4-416d-8518-267ad15b615f\" (UID: \"a2e018f2-61d4-416d-8518-267ad15b615f\") "
Apr 25 00:50:39.505961 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:39.505695 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d7411fc6-e656-4cf0-b884-5f4901fd379f-tmp-dir\") pod \"d7411fc6-e656-4cf0-b884-5f4901fd379f\" (UID: \"d7411fc6-e656-4cf0-b884-5f4901fd379f\") "
Apr 25 00:50:39.506112 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:39.506038 2569 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d7411fc6-e656-4cf0-b884-5f4901fd379f-home\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\""
Apr 25 00:50:39.506202 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:39.506178 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2e018f2-61d4-416d-8518-267ad15b615f-home" (OuterVolumeSpecName: "home") pod "a2e018f2-61d4-416d-8518-267ad15b615f" (UID: "a2e018f2-61d4-416d-8518-267ad15b615f"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 25 00:50:39.508086 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:39.508058 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2e018f2-61d4-416d-8518-267ad15b615f-dshm" (OuterVolumeSpecName: "dshm") pod "a2e018f2-61d4-416d-8518-267ad15b615f" (UID: "a2e018f2-61d4-416d-8518-267ad15b615f"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 25 00:50:39.508529 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:39.508278 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7411fc6-e656-4cf0-b884-5f4901fd379f-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "d7411fc6-e656-4cf0-b884-5f4901fd379f" (UID: "d7411fc6-e656-4cf0-b884-5f4901fd379f"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 25 00:50:39.508529 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:39.508518 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2e018f2-61d4-416d-8518-267ad15b615f-model-cache" (OuterVolumeSpecName: "model-cache") pod "a2e018f2-61d4-416d-8518-267ad15b615f" (UID: "a2e018f2-61d4-416d-8518-267ad15b615f"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 25 00:50:39.509286 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:39.509208 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7411fc6-e656-4cf0-b884-5f4901fd379f-model-cache" (OuterVolumeSpecName: "model-cache") pod "d7411fc6-e656-4cf0-b884-5f4901fd379f" (UID: "d7411fc6-e656-4cf0-b884-5f4901fd379f"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 25 00:50:39.511525 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:39.511487 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2e018f2-61d4-416d-8518-267ad15b615f-kube-api-access-92jvq" (OuterVolumeSpecName: "kube-api-access-92jvq") pod "a2e018f2-61d4-416d-8518-267ad15b615f" (UID: "a2e018f2-61d4-416d-8518-267ad15b615f"). InnerVolumeSpecName "kube-api-access-92jvq". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 25 00:50:39.511635 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:39.511533 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7411fc6-e656-4cf0-b884-5f4901fd379f-dshm" (OuterVolumeSpecName: "dshm") pod "d7411fc6-e656-4cf0-b884-5f4901fd379f" (UID: "d7411fc6-e656-4cf0-b884-5f4901fd379f"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 25 00:50:39.511635 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:39.511596 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7411fc6-e656-4cf0-b884-5f4901fd379f-kube-api-access-spj2t" (OuterVolumeSpecName: "kube-api-access-spj2t") pod "d7411fc6-e656-4cf0-b884-5f4901fd379f" (UID: "d7411fc6-e656-4cf0-b884-5f4901fd379f"). InnerVolumeSpecName "kube-api-access-spj2t". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 25 00:50:39.511635 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:39.511624 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7411fc6-e656-4cf0-b884-5f4901fd379f-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "d7411fc6-e656-4cf0-b884-5f4901fd379f" (UID: "d7411fc6-e656-4cf0-b884-5f4901fd379f"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 25 00:50:39.512163 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:39.512142 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2e018f2-61d4-416d-8518-267ad15b615f-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "a2e018f2-61d4-416d-8518-267ad15b615f" (UID: "a2e018f2-61d4-416d-8518-267ad15b615f"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 25 00:50:39.525601 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:39.525568 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2e018f2-61d4-416d-8518-267ad15b615f-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "a2e018f2-61d4-416d-8518-267ad15b615f" (UID: "a2e018f2-61d4-416d-8518-267ad15b615f"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 25 00:50:39.527841 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:39.527819 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7411fc6-e656-4cf0-b884-5f4901fd379f-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "d7411fc6-e656-4cf0-b884-5f4901fd379f" (UID: "d7411fc6-e656-4cf0-b884-5f4901fd379f"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 25 00:50:39.566152 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:39.566125 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2e018f2-61d4-416d-8518-267ad15b615f-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a2e018f2-61d4-416d-8518-267ad15b615f" (UID: "a2e018f2-61d4-416d-8518-267ad15b615f"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 25 00:50:39.607469 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:39.607446 2569 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a2e018f2-61d4-416d-8518-267ad15b615f-tmp-dir\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\""
Apr 25 00:50:39.607469 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:39.607467 2569 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d7411fc6-e656-4cf0-b884-5f4901fd379f-model-cache\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\""
Apr 25 00:50:39.607608 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:39.607477 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-92jvq\" (UniqueName: \"kubernetes.io/projected/a2e018f2-61d4-416d-8518-267ad15b615f-kube-api-access-92jvq\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\""
Apr 25 00:50:39.607608 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:39.607486 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a2e018f2-61d4-416d-8518-267ad15b615f-kserve-provision-location\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\""
Apr 25 00:50:39.607608 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:39.607498 2569 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d7411fc6-e656-4cf0-b884-5f4901fd379f-tmp-dir\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\""
Apr 25 00:50:39.607608 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:39.607511 2569 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a2e018f2-61d4-416d-8518-267ad15b615f-dshm\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\""
Apr 25 00:50:39.607608 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:39.607523 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-spj2t\" (UniqueName: \"kubernetes.io/projected/d7411fc6-e656-4cf0-b884-5f4901fd379f-kube-api-access-spj2t\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\""
Apr 25 00:50:39.607608 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:39.607531 2569 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a2e018f2-61d4-416d-8518-267ad15b615f-home\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\""
Apr 25 00:50:39.607608 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:39.607544 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d7411fc6-e656-4cf0-b884-5f4901fd379f-kserve-provision-location\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\""
Apr 25 00:50:39.607608 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:39.607552 2569 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d7411fc6-e656-4cf0-b884-5f4901fd379f-tls-certs\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\""
Apr 25 00:50:39.607608 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:39.607560 2569 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a2e018f2-61d4-416d-8518-267ad15b615f-model-cache\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\""
Apr 25 00:50:39.607608 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:39.607568 2569 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a2e018f2-61d4-416d-8518-267ad15b615f-tls-certs\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\""
Apr 25 00:50:39.607608 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:39.607575 2569 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d7411fc6-e656-4cf0-b884-5f4901fd379f-dshm\") on node \"ip-10-0-133-214.ec2.internal\" DevicePath \"\""
Apr 25 00:50:39.919612 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:39.919584 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-694999c9f5-txkqq_d7411fc6-e656-4cf0-b884-5f4901fd379f/storage-initializer/0.log"
Apr 25 00:50:39.919799 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:39.919628 2569 generic.go:358] "Generic (PLEG): container finished" podID="d7411fc6-e656-4cf0-b884-5f4901fd379f" containerID="ec95f2ff0ae88858a69e3ca963960204b4e93016a3542d54affb3474398fa540" exitCode=137
Apr 25 00:50:39.919799 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:39.919709 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-694999c9f5-txkqq"
Apr 25 00:50:39.919799 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:39.919717 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-694999c9f5-txkqq" event={"ID":"d7411fc6-e656-4cf0-b884-5f4901fd379f","Type":"ContainerDied","Data":"ec95f2ff0ae88858a69e3ca963960204b4e93016a3542d54affb3474398fa540"}
Apr 25 00:50:39.919799 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:39.919756 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-694999c9f5-txkqq" event={"ID":"d7411fc6-e656-4cf0-b884-5f4901fd379f","Type":"ContainerDied","Data":"ac1c3e8bac7460b51a04a8cd62ed773b7a930295664e523c475eca486efb77d7"}
Apr 25 00:50:39.919799 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:39.919772 2569 scope.go:117] "RemoveContainer" containerID="ec95f2ff0ae88858a69e3ca963960204b4e93016a3542d54affb3474398fa540"
Apr 25 00:50:39.921329 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:39.921311 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7768b7748-7n9p4_a2e018f2-61d4-416d-8518-267ad15b615f/main/0.log"
Apr 25 00:50:39.921963 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:39.921936 2569 generic.go:358] "Generic (PLEG): container finished" podID="a2e018f2-61d4-416d-8518-267ad15b615f" containerID="276182a6c0710195e6549148f9bb61686bc217394bff5281fcb18aded45dba9a" exitCode=137
Apr 25 00:50:39.921963 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:39.921961 2569 generic.go:358] "Generic (PLEG): container finished" podID="a2e018f2-61d4-416d-8518-267ad15b615f" containerID="cbc34650c7cdaaf8a2eae406c3de98806dc333c69a4fc47ace52d689148629f4" exitCode=0
Apr 25 00:50:39.922101 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:39.921992 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7768b7748-7n9p4" event={"ID":"a2e018f2-61d4-416d-8518-267ad15b615f","Type":"ContainerDied","Data":"276182a6c0710195e6549148f9bb61686bc217394bff5281fcb18aded45dba9a"}
Apr 25 00:50:39.922101 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:39.922037 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7768b7748-7n9p4" event={"ID":"a2e018f2-61d4-416d-8518-267ad15b615f","Type":"ContainerDied","Data":"cbc34650c7cdaaf8a2eae406c3de98806dc333c69a4fc47ace52d689148629f4"}
Apr 25 00:50:39.922101 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:39.922063 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7768b7748-7n9p4" event={"ID":"a2e018f2-61d4-416d-8518-267ad15b615f","Type":"ContainerDied","Data":"33795ad22e06ab8e3186bafaa94ceffe45cb5c84db40bc96ea856996b3c0d57f"}
Apr 25 00:50:39.922101 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:39.922008 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7768b7748-7n9p4"
Apr 25 00:50:39.940076 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:39.940059 2569 scope.go:117] "RemoveContainer" containerID="ec95f2ff0ae88858a69e3ca963960204b4e93016a3542d54affb3474398fa540"
Apr 25 00:50:39.940339 ip-10-0-133-214 kubenswrapper[2569]: E0425 00:50:39.940320 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec95f2ff0ae88858a69e3ca963960204b4e93016a3542d54affb3474398fa540\": container with ID starting with ec95f2ff0ae88858a69e3ca963960204b4e93016a3542d54affb3474398fa540 not found: ID does not exist" containerID="ec95f2ff0ae88858a69e3ca963960204b4e93016a3542d54affb3474398fa540"
Apr 25 00:50:39.940426 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:39.940345 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec95f2ff0ae88858a69e3ca963960204b4e93016a3542d54affb3474398fa540"} err="failed to get container status \"ec95f2ff0ae88858a69e3ca963960204b4e93016a3542d54affb3474398fa540\": rpc error: code = NotFound desc = could not find container \"ec95f2ff0ae88858a69e3ca963960204b4e93016a3542d54affb3474398fa540\": container with ID starting with ec95f2ff0ae88858a69e3ca963960204b4e93016a3542d54affb3474398fa540 not found: ID does not exist"
Apr 25 00:50:39.940426 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:39.940361 2569 scope.go:117] "RemoveContainer" containerID="276182a6c0710195e6549148f9bb61686bc217394bff5281fcb18aded45dba9a"
Apr 25 00:50:39.949696 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:39.949588 2569 scope.go:117] "RemoveContainer" containerID="6fdd0aa1201383381842e2bd7129cfcc10200bae8a5cb8d0106b37b78203739e"
Apr 25 00:50:39.959369 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:39.959341 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-694999c9f5-txkqq"]
Apr 25 00:50:39.962232 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:39.962210 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-694999c9f5-txkqq"]
Apr 25 00:50:39.974877 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:39.972951 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7768b7748-7n9p4"]
Apr 25 00:50:39.984112 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:39.984075 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7411fc6-e656-4cf0-b884-5f4901fd379f" path="/var/lib/kubelet/pods/d7411fc6-e656-4cf0-b884-5f4901fd379f/volumes"
Apr 25 00:50:39.984472 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:39.984455 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7768b7748-7n9p4"]
Apr 25 00:50:40.015523 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:40.015494 2569 scope.go:117] "RemoveContainer" containerID="cbc34650c7cdaaf8a2eae406c3de98806dc333c69a4fc47ace52d689148629f4"
Apr 25 00:50:40.023904 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:40.023887 2569 scope.go:117] "RemoveContainer" containerID="276182a6c0710195e6549148f9bb61686bc217394bff5281fcb18aded45dba9a"
Apr 25 00:50:40.024133 ip-10-0-133-214 kubenswrapper[2569]: E0425 00:50:40.024112 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"276182a6c0710195e6549148f9bb61686bc217394bff5281fcb18aded45dba9a\": container with ID starting with 276182a6c0710195e6549148f9bb61686bc217394bff5281fcb18aded45dba9a not found: ID does not exist" containerID="276182a6c0710195e6549148f9bb61686bc217394bff5281fcb18aded45dba9a"
Apr 25 00:50:40.024204 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:40.024140 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"276182a6c0710195e6549148f9bb61686bc217394bff5281fcb18aded45dba9a"} err="failed to get container status \"276182a6c0710195e6549148f9bb61686bc217394bff5281fcb18aded45dba9a\": rpc error: code = NotFound desc = could not find container \"276182a6c0710195e6549148f9bb61686bc217394bff5281fcb18aded45dba9a\": container with ID starting with 276182a6c0710195e6549148f9bb61686bc217394bff5281fcb18aded45dba9a not found: ID does not exist"
Apr 25 00:50:40.024204 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:40.024161 2569 scope.go:117] "RemoveContainer" containerID="6fdd0aa1201383381842e2bd7129cfcc10200bae8a5cb8d0106b37b78203739e"
Apr 25 00:50:40.024373 ip-10-0-133-214 kubenswrapper[2569]: E0425 00:50:40.024358 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6fdd0aa1201383381842e2bd7129cfcc10200bae8a5cb8d0106b37b78203739e\": container with ID starting with 6fdd0aa1201383381842e2bd7129cfcc10200bae8a5cb8d0106b37b78203739e not found: ID does not exist" containerID="6fdd0aa1201383381842e2bd7129cfcc10200bae8a5cb8d0106b37b78203739e"
Apr 25 00:50:40.024437 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:40.024375 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6fdd0aa1201383381842e2bd7129cfcc10200bae8a5cb8d0106b37b78203739e"} err="failed to get container status \"6fdd0aa1201383381842e2bd7129cfcc10200bae8a5cb8d0106b37b78203739e\": rpc error: code = NotFound desc = could not find container \"6fdd0aa1201383381842e2bd7129cfcc10200bae8a5cb8d0106b37b78203739e\": container with ID starting with 6fdd0aa1201383381842e2bd7129cfcc10200bae8a5cb8d0106b37b78203739e not found: ID does not exist"
Apr 25 00:50:40.024437 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:40.024387 2569 scope.go:117] "RemoveContainer" containerID="cbc34650c7cdaaf8a2eae406c3de98806dc333c69a4fc47ace52d689148629f4"
Apr 25 00:50:40.024661 ip-10-0-133-214 kubenswrapper[2569]: E0425 00:50:40.024638 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cbc34650c7cdaaf8a2eae406c3de98806dc333c69a4fc47ace52d689148629f4\": container with ID starting with cbc34650c7cdaaf8a2eae406c3de98806dc333c69a4fc47ace52d689148629f4 not found: ID does not exist" containerID="cbc34650c7cdaaf8a2eae406c3de98806dc333c69a4fc47ace52d689148629f4"
Apr 25 00:50:40.024705 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:40.024668 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbc34650c7cdaaf8a2eae406c3de98806dc333c69a4fc47ace52d689148629f4"} err="failed to get container status \"cbc34650c7cdaaf8a2eae406c3de98806dc333c69a4fc47ace52d689148629f4\": rpc error: code = NotFound desc = could not find container \"cbc34650c7cdaaf8a2eae406c3de98806dc333c69a4fc47ace52d689148629f4\": container with ID starting with cbc34650c7cdaaf8a2eae406c3de98806dc333c69a4fc47ace52d689148629f4 not found: ID does not exist"
Apr 25 00:50:40.024705 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:40.024685 2569 scope.go:117] "RemoveContainer" containerID="276182a6c0710195e6549148f9bb61686bc217394bff5281fcb18aded45dba9a"
Apr 25 00:50:40.024889 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:40.024873 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"276182a6c0710195e6549148f9bb61686bc217394bff5281fcb18aded45dba9a"} err="failed to get container status \"276182a6c0710195e6549148f9bb61686bc217394bff5281fcb18aded45dba9a\": rpc error: code = NotFound desc = could not find container \"276182a6c0710195e6549148f9bb61686bc217394bff5281fcb18aded45dba9a\": container with ID starting with 276182a6c0710195e6549148f9bb61686bc217394bff5281fcb18aded45dba9a not found: ID does not exist"
Apr 25 00:50:40.024934 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:40.024890 2569 scope.go:117] "RemoveContainer" containerID="6fdd0aa1201383381842e2bd7129cfcc10200bae8a5cb8d0106b37b78203739e"
Apr 25 00:50:40.025068 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:40.025053 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6fdd0aa1201383381842e2bd7129cfcc10200bae8a5cb8d0106b37b78203739e"} err="failed to get container status \"6fdd0aa1201383381842e2bd7129cfcc10200bae8a5cb8d0106b37b78203739e\": rpc error: code = NotFound desc = could not find container \"6fdd0aa1201383381842e2bd7129cfcc10200bae8a5cb8d0106b37b78203739e\": container with ID starting with 6fdd0aa1201383381842e2bd7129cfcc10200bae8a5cb8d0106b37b78203739e not found: ID does not exist"
Apr 25 00:50:40.025166 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:40.025067 2569 scope.go:117] "RemoveContainer" containerID="cbc34650c7cdaaf8a2eae406c3de98806dc333c69a4fc47ace52d689148629f4"
Apr 25 00:50:40.025286 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:40.025264 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbc34650c7cdaaf8a2eae406c3de98806dc333c69a4fc47ace52d689148629f4"} err="failed to get container status \"cbc34650c7cdaaf8a2eae406c3de98806dc333c69a4fc47ace52d689148629f4\": rpc error: code = NotFound desc = could not find container \"cbc34650c7cdaaf8a2eae406c3de98806dc333c69a4fc47ace52d689148629f4\": container with ID starting with cbc34650c7cdaaf8a2eae406c3de98806dc333c69a4fc47ace52d689148629f4 not found: ID does not exist"
Apr 25 00:50:40.365237 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:40.365208 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-2lzjq_58046be0-da36-4df4-af44-0f6f68c596ea/discovery/0.log"
Apr 25 00:50:41.258264 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:41.258234 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-2lzjq_58046be0-da36-4df4-af44-0f6f68c596ea/discovery/0.log"
Apr 25 00:50:41.977885 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:41.977853 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2e018f2-61d4-416d-8518-267ad15b615f" path="/var/lib/kubelet/pods/a2e018f2-61d4-416d-8518-267ad15b615f/volumes"
Apr 25 00:50:42.110801 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:42.110759 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-2lzjq_58046be0-da36-4df4-af44-0f6f68c596ea/discovery/0.log"
Apr 25 00:50:42.875735 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:42.875700 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-7587b89b76-mzz8b_ee6bf646-424e-4677-afbe-0c00df0548c7/manager/0.log"
Apr 25 00:50:42.884754 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:42.884733 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-844548ff4c-rkxq5_7c3b8652-c669-4e33-ae9c-b60cfae028c4/manager/0.log"
Apr 25 00:50:42.975109 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:42.975075 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-c7fb4c8d5-645kp_2d813bc9-0850-4c30-9d76-eef15935a388/manager/0.log"
Apr 25 00:50:43.805915 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:43.805883 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-7587b89b76-mzz8b_ee6bf646-424e-4677-afbe-0c00df0548c7/manager/0.log"
Apr 25 00:50:43.817165 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:43.817141 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-844548ff4c-rkxq5_7c3b8652-c669-4e33-ae9c-b60cfae028c4/manager/0.log"
Apr 25 00:50:43.904027 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:43.904000 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-c7fb4c8d5-645kp_2d813bc9-0850-4c30-9d76-eef15935a388/manager/0.log"
Apr 25 00:50:44.721299 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:44.721266 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-7587b89b76-mzz8b_ee6bf646-424e-4677-afbe-0c00df0548c7/manager/0.log"
Apr 25 00:50:44.731985 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:44.731963 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-844548ff4c-rkxq5_7c3b8652-c669-4e33-ae9c-b60cfae028c4/manager/0.log"
Apr 25 00:50:44.815344 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:44.815315 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-c7fb4c8d5-645kp_2d813bc9-0850-4c30-9d76-eef15935a388/manager/0.log"
Apr 25 00:50:45.616295 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:45.616261 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-7587b89b76-mzz8b_ee6bf646-424e-4677-afbe-0c00df0548c7/manager/0.log"
Apr 25 00:50:45.627034 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:45.627012 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-844548ff4c-rkxq5_7c3b8652-c669-4e33-ae9c-b60cfae028c4/manager/0.log"
Apr 25 00:50:45.707072 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:45.707045 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-c7fb4c8d5-645kp_2d813bc9-0850-4c30-9d76-eef15935a388/manager/0.log"
Apr 25 00:50:46.519787 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:46.519758 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-7587b89b76-mzz8b_ee6bf646-424e-4677-afbe-0c00df0548c7/manager/0.log"
Apr 25 00:50:46.529995 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:46.529966 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-844548ff4c-rkxq5_7c3b8652-c669-4e33-ae9c-b60cfae028c4/manager/0.log"
Apr 25 00:50:46.612686 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:46.612657 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-c7fb4c8d5-645kp_2d813bc9-0850-4c30-9d76-eef15935a388/manager/0.log"
Apr 25 00:50:51.694140 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:51.694110 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-pwpfb_d13d238f-ed70-4223-a161-a53a31be9a63/global-pull-secret-syncer/0.log"
Apr 25 00:50:51.738309 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:51.738277 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-2r7xx_a9b41727-9688-4980-8671-a860f9ccf954/konnectivity-agent/0.log"
Apr 25 00:50:51.870261 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:51.870228 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-133-214.ec2.internal_413e1dcc7a9acaef383b6c159ccd3bed/haproxy/0.log"
Apr 25 00:50:55.857536 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:55.857508 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-7587b89b76-mzz8b_ee6bf646-424e-4677-afbe-0c00df0548c7/manager/0.log"
Apr 25 00:50:55.889056 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:55.889028 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-844548ff4c-rkxq5_7c3b8652-c669-4e33-ae9c-b60cfae028c4/manager/0.log"
Apr 25 00:50:56.026114 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:56.026083 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-c7fb4c8d5-645kp_2d813bc9-0850-4c30-9d76-eef15935a388/manager/0.log"
Apr 25 00:50:57.347736 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:57.347703 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-bdlrw_b67d7b0e-0fdf-4585-a96e-98063b80e4c3/node-exporter/0.log"
Apr 25 00:50:57.366848 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:57.366815 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-bdlrw_b67d7b0e-0fdf-4585-a96e-98063b80e4c3/kube-rbac-proxy/0.log"
Apr 25 00:50:57.388739 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:57.388717 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-bdlrw_b67d7b0e-0fdf-4585-a96e-98063b80e4c3/init-textfile/0.log"
Apr 25 00:50:57.979910 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:50:57.979858 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-x8pb6_499f22b6-5f9c-4b8d-9958-51fbade0900a/prometheus-operator-admission-webhook/0.log"
Apr 25 00:51:00.370627 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:51:00.370598 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6f77b8c9c8-vch7j_31d18e68-4a12-4c04-aeee-75cd50a3dd79/console/0.log"
Apr 25 00:51:00.414322 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:51:00.414290 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-ddrd5_582002e5-6e19-4c7c-afa8-2c680db672f4/download-server/0.log"
Apr 25 00:51:00.505027 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:51:00.504995 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-dtbkd/perf-node-gather-daemonset-2spnf"]
Apr 25 00:51:00.505371 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:51:00.505359 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a2e018f2-61d4-416d-8518-267ad15b615f" containerName="llm-d-routing-sidecar"
Apr 25 00:51:00.505441 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:51:00.505373 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2e018f2-61d4-416d-8518-267ad15b615f" containerName="llm-d-routing-sidecar"
Apr 25 00:51:00.505441 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:51:00.505381 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a2e018f2-61d4-416d-8518-267ad15b615f" containerName="main"
Apr 25 00:51:00.505441 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:51:00.505386 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2e018f2-61d4-416d-8518-267ad15b615f" containerName="main"
Apr 25 00:51:00.505441 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:51:00.505421 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7" containerName="main"
Apr 25 00:51:00.505441 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:51:00.505428 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7" containerName="main"
Apr 25 00:51:00.505441 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:51:00.505438 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d7411fc6-e656-4cf0-b884-5f4901fd379f" containerName="storage-initializer"
Apr 25 00:51:00.505441 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:51:00.505443 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7411fc6-e656-4cf0-b884-5f4901fd379f" containerName="storage-initializer"
Apr 25 00:51:00.505674 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:51:00.505455 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a2e018f2-61d4-416d-8518-267ad15b615f" containerName="storage-initializer"
Apr 25 00:51:00.505674 ip-10-0-133-214
kubenswrapper[2569]: I0425 00:51:00.505460 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2e018f2-61d4-416d-8518-267ad15b615f" containerName="storage-initializer" Apr 25 00:51:00.505674 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:51:00.505470 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7" containerName="storage-initializer" Apr 25 00:51:00.505674 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:51:00.505475 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7" containerName="storage-initializer" Apr 25 00:51:00.505674 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:51:00.505532 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="a2e018f2-61d4-416d-8518-267ad15b615f" containerName="main" Apr 25 00:51:00.505674 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:51:00.505541 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="d7411fc6-e656-4cf0-b884-5f4901fd379f" containerName="storage-initializer" Apr 25 00:51:00.505674 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:51:00.505548 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="5f1f9e22-5f86-4ea2-aae9-f1dd0a1c9ce7" containerName="main" Apr 25 00:51:00.505674 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:51:00.505554 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="a2e018f2-61d4-416d-8518-267ad15b615f" containerName="llm-d-routing-sidecar" Apr 25 00:51:00.508696 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:51:00.508679 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-dtbkd/perf-node-gather-daemonset-2spnf" Apr 25 00:51:00.511123 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:51:00.511104 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-dtbkd\"/\"default-dockercfg-svszf\"" Apr 25 00:51:00.511222 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:51:00.511107 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-dtbkd\"/\"kube-root-ca.crt\"" Apr 25 00:51:00.511760 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:51:00.511744 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-dtbkd\"/\"openshift-service-ca.crt\"" Apr 25 00:51:00.518820 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:51:00.518797 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-dtbkd/perf-node-gather-daemonset-2spnf"] Apr 25 00:51:00.578185 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:51:00.578159 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/9f87a5e6-370e-4b3c-a8de-1e1625fae19d-proc\") pod \"perf-node-gather-daemonset-2spnf\" (UID: \"9f87a5e6-370e-4b3c-a8de-1e1625fae19d\") " pod="openshift-must-gather-dtbkd/perf-node-gather-daemonset-2spnf" Apr 25 00:51:00.578344 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:51:00.578195 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8snqh\" (UniqueName: \"kubernetes.io/projected/9f87a5e6-370e-4b3c-a8de-1e1625fae19d-kube-api-access-8snqh\") pod \"perf-node-gather-daemonset-2spnf\" (UID: \"9f87a5e6-370e-4b3c-a8de-1e1625fae19d\") " pod="openshift-must-gather-dtbkd/perf-node-gather-daemonset-2spnf" Apr 25 00:51:00.578344 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:51:00.578220 2569 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9f87a5e6-370e-4b3c-a8de-1e1625fae19d-sys\") pod \"perf-node-gather-daemonset-2spnf\" (UID: \"9f87a5e6-370e-4b3c-a8de-1e1625fae19d\") " pod="openshift-must-gather-dtbkd/perf-node-gather-daemonset-2spnf" Apr 25 00:51:00.578344 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:51:00.578307 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/9f87a5e6-370e-4b3c-a8de-1e1625fae19d-podres\") pod \"perf-node-gather-daemonset-2spnf\" (UID: \"9f87a5e6-370e-4b3c-a8de-1e1625fae19d\") " pod="openshift-must-gather-dtbkd/perf-node-gather-daemonset-2spnf" Apr 25 00:51:00.578507 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:51:00.578360 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9f87a5e6-370e-4b3c-a8de-1e1625fae19d-lib-modules\") pod \"perf-node-gather-daemonset-2spnf\" (UID: \"9f87a5e6-370e-4b3c-a8de-1e1625fae19d\") " pod="openshift-must-gather-dtbkd/perf-node-gather-daemonset-2spnf" Apr 25 00:51:00.679555 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:51:00.679487 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9f87a5e6-370e-4b3c-a8de-1e1625fae19d-sys\") pod \"perf-node-gather-daemonset-2spnf\" (UID: \"9f87a5e6-370e-4b3c-a8de-1e1625fae19d\") " pod="openshift-must-gather-dtbkd/perf-node-gather-daemonset-2spnf" Apr 25 00:51:00.679555 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:51:00.679529 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/9f87a5e6-370e-4b3c-a8de-1e1625fae19d-podres\") pod \"perf-node-gather-daemonset-2spnf\" (UID: \"9f87a5e6-370e-4b3c-a8de-1e1625fae19d\") " 
pod="openshift-must-gather-dtbkd/perf-node-gather-daemonset-2spnf" Apr 25 00:51:00.679710 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:51:00.679564 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9f87a5e6-370e-4b3c-a8de-1e1625fae19d-lib-modules\") pod \"perf-node-gather-daemonset-2spnf\" (UID: \"9f87a5e6-370e-4b3c-a8de-1e1625fae19d\") " pod="openshift-must-gather-dtbkd/perf-node-gather-daemonset-2spnf" Apr 25 00:51:00.679710 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:51:00.679607 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/9f87a5e6-370e-4b3c-a8de-1e1625fae19d-proc\") pod \"perf-node-gather-daemonset-2spnf\" (UID: \"9f87a5e6-370e-4b3c-a8de-1e1625fae19d\") " pod="openshift-must-gather-dtbkd/perf-node-gather-daemonset-2spnf" Apr 25 00:51:00.679710 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:51:00.679616 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9f87a5e6-370e-4b3c-a8de-1e1625fae19d-sys\") pod \"perf-node-gather-daemonset-2spnf\" (UID: \"9f87a5e6-370e-4b3c-a8de-1e1625fae19d\") " pod="openshift-must-gather-dtbkd/perf-node-gather-daemonset-2spnf" Apr 25 00:51:00.679710 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:51:00.679631 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8snqh\" (UniqueName: \"kubernetes.io/projected/9f87a5e6-370e-4b3c-a8de-1e1625fae19d-kube-api-access-8snqh\") pod \"perf-node-gather-daemonset-2spnf\" (UID: \"9f87a5e6-370e-4b3c-a8de-1e1625fae19d\") " pod="openshift-must-gather-dtbkd/perf-node-gather-daemonset-2spnf" Apr 25 00:51:00.679710 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:51:00.679652 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: 
\"kubernetes.io/host-path/9f87a5e6-370e-4b3c-a8de-1e1625fae19d-podres\") pod \"perf-node-gather-daemonset-2spnf\" (UID: \"9f87a5e6-370e-4b3c-a8de-1e1625fae19d\") " pod="openshift-must-gather-dtbkd/perf-node-gather-daemonset-2spnf" Apr 25 00:51:00.679892 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:51:00.679706 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/9f87a5e6-370e-4b3c-a8de-1e1625fae19d-proc\") pod \"perf-node-gather-daemonset-2spnf\" (UID: \"9f87a5e6-370e-4b3c-a8de-1e1625fae19d\") " pod="openshift-must-gather-dtbkd/perf-node-gather-daemonset-2spnf" Apr 25 00:51:00.679892 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:51:00.679737 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9f87a5e6-370e-4b3c-a8de-1e1625fae19d-lib-modules\") pod \"perf-node-gather-daemonset-2spnf\" (UID: \"9f87a5e6-370e-4b3c-a8de-1e1625fae19d\") " pod="openshift-must-gather-dtbkd/perf-node-gather-daemonset-2spnf" Apr 25 00:51:00.687338 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:51:00.687314 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8snqh\" (UniqueName: \"kubernetes.io/projected/9f87a5e6-370e-4b3c-a8de-1e1625fae19d-kube-api-access-8snqh\") pod \"perf-node-gather-daemonset-2spnf\" (UID: \"9f87a5e6-370e-4b3c-a8de-1e1625fae19d\") " pod="openshift-must-gather-dtbkd/perf-node-gather-daemonset-2spnf" Apr 25 00:51:00.819577 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:51:00.819549 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-dtbkd/perf-node-gather-daemonset-2spnf" Apr 25 00:51:00.941257 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:51:00.941229 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-dtbkd/perf-node-gather-daemonset-2spnf"] Apr 25 00:51:00.942988 ip-10-0-133-214 kubenswrapper[2569]: W0425 00:51:00.942959 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod9f87a5e6_370e_4b3c_a8de_1e1625fae19d.slice/crio-738990bd681a1a144c31d3f911dc00f5a9e52652c952f4750e3fa468a1dbc3d3 WatchSource:0}: Error finding container 738990bd681a1a144c31d3f911dc00f5a9e52652c952f4750e3fa468a1dbc3d3: Status 404 returned error can't find the container with id 738990bd681a1a144c31d3f911dc00f5a9e52652c952f4750e3fa468a1dbc3d3 Apr 25 00:51:00.944561 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:51:00.944544 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 25 00:51:01.007794 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:51:01.007764 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dtbkd/perf-node-gather-daemonset-2spnf" event={"ID":"9f87a5e6-370e-4b3c-a8de-1e1625fae19d","Type":"ContainerStarted","Data":"738990bd681a1a144c31d3f911dc00f5a9e52652c952f4750e3fa468a1dbc3d3"} Apr 25 00:51:01.622537 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:51:01.622500 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-ckvqn_586d6de3-4325-4b34-af6a-576dc929fdce/dns/0.log" Apr 25 00:51:01.641989 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:51:01.641965 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-ckvqn_586d6de3-4325-4b34-af6a-576dc929fdce/kube-rbac-proxy/0.log" Apr 25 00:51:01.778415 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:51:01.778387 2569 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-dns_node-resolver-pfmxs_191956df-226e-4581-bd3c-7661b41d8536/dns-node-resolver/0.log" Apr 25 00:51:02.012923 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:51:02.012894 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dtbkd/perf-node-gather-daemonset-2spnf" event={"ID":"9f87a5e6-370e-4b3c-a8de-1e1625fae19d","Type":"ContainerStarted","Data":"38bac7c32dc4aa90bcddc52b4e7c5e5a0c7e1c3b198e1df894ce016a8e317383"} Apr 25 00:51:02.013091 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:51:02.013063 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-dtbkd/perf-node-gather-daemonset-2spnf" Apr 25 00:51:02.031052 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:51:02.031009 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-dtbkd/perf-node-gather-daemonset-2spnf" podStartSLOduration=2.030995944 podStartE2EDuration="2.030995944s" podCreationTimestamp="2026-04-25 00:51:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-25 00:51:02.028124991 +0000 UTC m=+3440.632851586" watchObservedRunningTime="2026-04-25 00:51:02.030995944 +0000 UTC m=+3440.635722541" Apr 25 00:51:02.222586 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:51:02.222557 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-pruner-29617920-jhs4w_86623852-f437-4e94-8774-6652dabed4fb/image-pruner/0.log" Apr 25 00:51:02.275535 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:51:02.275459 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-76d68c7b68-hgbqp_25b6cdb5-aed3-4024-93b1-63dd4d5a7299/registry/0.log" Apr 25 00:51:02.342608 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:51:02.342576 2569 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-image-registry_node-ca-rsd5n_dd9e66f5-07c5-45ce-92ad-16649e745d96/node-ca/0.log" Apr 25 00:51:03.207319 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:51:03.207289 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-2lzjq_58046be0-da36-4df4-af44-0f6f68c596ea/discovery/0.log" Apr 25 00:51:03.731635 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:51:03.731601 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-xvjhv_b819f9b3-e5fe-4501-9625-b73431d3105c/serve-healthcheck-canary/0.log" Apr 25 00:51:04.179930 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:51:04.179896 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-7xqlq_4d74dd05-34d6-465d-b59b-4694e782b05f/kube-rbac-proxy/0.log" Apr 25 00:51:04.200263 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:51:04.200235 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-7xqlq_4d74dd05-34d6-465d-b59b-4694e782b05f/exporter/0.log" Apr 25 00:51:04.221493 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:51:04.221467 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-7xqlq_4d74dd05-34d6-465d-b59b-4694e782b05f/extractor/0.log" Apr 25 00:51:07.415432 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:51:07.415389 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-64c4d9588d-pj2cx_240d3421-bb02-433c-8bfc-50a8cc3a6eff/manager/0.log" Apr 25 00:51:07.485068 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:51:07.485030 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-5d945688d8-zcv7x_5cdea8df-86df-4d4f-81b2-d9d6df6c04d2/manager/0.log" Apr 25 00:51:07.507108 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:51:07.507086 2569 log.go:25] "Finished 
parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-5c775_a65f09a5-572e-4db9-87ca-a10c2ae08d7e/server/0.log" Apr 25 00:51:07.924417 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:51:07.924371 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-6gqhc_3f125a41-7d58-42c0-86d8-9e89c5c6d9fd/manager/0.log" Apr 25 00:51:07.978422 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:51:07.978370 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-qcgmd_adc22d8c-100e-4e57-bcad-4319af5e0d4f/seaweedfs/0.log" Apr 25 00:51:08.026566 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:51:08.026536 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-dtbkd/perf-node-gather-daemonset-2spnf" Apr 25 00:51:12.431910 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:51:12.431879 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-cwh6l_6b80eb80-c7b4-43b8-b29c-8db00b583a49/migrator/0.log" Apr 25 00:51:12.452808 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:51:12.452783 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-cwh6l_6b80eb80-c7b4-43b8-b29c-8db00b583a49/graceful-termination/0.log" Apr 25 00:51:13.961053 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:51:13.961021 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-gv8rj_cfab2e9e-eb84-4b70-bc59-197bc3f27fb6/kube-multus-additional-cni-plugins/0.log" Apr 25 00:51:13.982252 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:51:13.982223 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-gv8rj_cfab2e9e-eb84-4b70-bc59-197bc3f27fb6/egress-router-binary-copy/0.log" Apr 25 00:51:14.001352 ip-10-0-133-214 kubenswrapper[2569]: I0425 
00:51:14.001325 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-gv8rj_cfab2e9e-eb84-4b70-bc59-197bc3f27fb6/cni-plugins/0.log" Apr 25 00:51:14.020511 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:51:14.020487 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-gv8rj_cfab2e9e-eb84-4b70-bc59-197bc3f27fb6/bond-cni-plugin/0.log" Apr 25 00:51:14.039515 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:51:14.039491 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-gv8rj_cfab2e9e-eb84-4b70-bc59-197bc3f27fb6/routeoverride-cni/0.log" Apr 25 00:51:14.059115 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:51:14.059084 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-gv8rj_cfab2e9e-eb84-4b70-bc59-197bc3f27fb6/whereabouts-cni-bincopy/0.log" Apr 25 00:51:14.077797 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:51:14.077774 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-gv8rj_cfab2e9e-eb84-4b70-bc59-197bc3f27fb6/whereabouts-cni/0.log" Apr 25 00:51:14.327850 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:51:14.327821 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pk5cg_04a809d7-5a4b-4d1b-b069-41b0cd06e320/kube-multus/0.log" Apr 25 00:51:14.394336 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:51:14.394311 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-npwg7_37e765cb-b1c9-4330-ac47-4918ba2ebf0a/network-metrics-daemon/0.log" Apr 25 00:51:14.411015 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:51:14.410989 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-npwg7_37e765cb-b1c9-4330-ac47-4918ba2ebf0a/kube-rbac-proxy/0.log" Apr 25 
00:51:16.030597 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:51:16.030570 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xnqmd_0e517da2-c437-430c-aec1-02e2d22665ca/ovn-controller/0.log" Apr 25 00:51:16.075053 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:51:16.075020 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xnqmd_0e517da2-c437-430c-aec1-02e2d22665ca/ovn-acl-logging/0.log" Apr 25 00:51:16.094263 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:51:16.094239 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xnqmd_0e517da2-c437-430c-aec1-02e2d22665ca/kube-rbac-proxy-node/0.log" Apr 25 00:51:16.113148 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:51:16.113122 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xnqmd_0e517da2-c437-430c-aec1-02e2d22665ca/kube-rbac-proxy-ovn-metrics/0.log" Apr 25 00:51:16.133083 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:51:16.133058 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xnqmd_0e517da2-c437-430c-aec1-02e2d22665ca/northd/0.log" Apr 25 00:51:16.152802 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:51:16.152778 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xnqmd_0e517da2-c437-430c-aec1-02e2d22665ca/nbdb/0.log" Apr 25 00:51:16.172110 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:51:16.172086 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xnqmd_0e517da2-c437-430c-aec1-02e2d22665ca/sbdb/0.log" Apr 25 00:51:16.371464 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:51:16.371381 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xnqmd_0e517da2-c437-430c-aec1-02e2d22665ca/ovnkube-controller/0.log" Apr 25 
00:51:17.498942 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:51:17.498910 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-lsbf2_44764fbc-9742-4d96-ae9e-d45956e60888/network-check-target-container/0.log" Apr 25 00:51:18.443674 ip-10-0-133-214 kubenswrapper[2569]: I0425 00:51:18.443618 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-5s57p_92d3f68f-64da-4914-9d0e-66109d7ac351/iptables-alerter/0.log"