Apr 16 17:37:41.420694 ip-10-0-141-32 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 16 17:37:41.420702 ip-10-0-141-32 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 16 17:37:41.420709 ip-10-0-141-32 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 16 17:37:41.420928 ip-10-0-141-32 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 16 17:37:51.541661 ip-10-0-141-32 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 16 17:37:51.541677 ip-10-0-141-32 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 1417a3cdefe84c2886062c0f27ee2211 --
Apr 16 17:40:01.849203 ip-10-0-141-32 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 17:40:02.346111 ip-10-0-141-32 kubenswrapper[2576]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 17:40:02.346111 ip-10-0-141-32 kubenswrapper[2576]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 17:40:02.346111 ip-10-0-141-32 kubenswrapper[2576]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 17:40:02.346111 ip-10-0-141-32 kubenswrapper[2576]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 17:40:02.346111 ip-10-0-141-32 kubenswrapper[2576]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 17:40:02.348077 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.347919    2576 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 17:40:02.351413 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.351392    2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 17:40:02.351413 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.351412    2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 17:40:02.351483 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.351416    2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 17:40:02.351483 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.351420    2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 17:40:02.351483 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.351422    2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 17:40:02.351483 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.351427    2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 17:40:02.351483 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.351431    2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 17:40:02.351483 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.351435    2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 17:40:02.351483 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.351438    2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 17:40:02.351483 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.351440    2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 17:40:02.351483 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.351443    2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 17:40:02.351483 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.351447    2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 17:40:02.351483 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.351450    2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 17:40:02.351483 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.351452    2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 17:40:02.351483 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.351455    2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 17:40:02.351483 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.351458    2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 17:40:02.351483 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.351467    2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 17:40:02.351483 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.351470    2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 17:40:02.351483 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.351473    2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 17:40:02.351483 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.351476    2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 17:40:02.351483 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.351479    2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 17:40:02.351483 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.351481    2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 17:40:02.351980 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.351485    2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 17:40:02.351980 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.351488    2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 17:40:02.351980 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.351492    2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 17:40:02.351980 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.351495    2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 17:40:02.351980 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.351498    2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 17:40:02.351980 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.351501    2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 17:40:02.351980 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.351503    2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 17:40:02.351980 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.351506    2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 17:40:02.351980 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.351508    2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 17:40:02.351980 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.351511    2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 17:40:02.351980 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.351514    2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 17:40:02.351980 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.351518    2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 17:40:02.351980 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.351522    2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 17:40:02.351980 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.351525    2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 17:40:02.351980 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.351528    2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 17:40:02.351980 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.351530    2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 17:40:02.351980 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.351533    2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 17:40:02.351980 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.351535    2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 17:40:02.351980 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.351538    2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 17:40:02.352471 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.351541    2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 17:40:02.352471 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.351543    2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 17:40:02.352471 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.351546    2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 17:40:02.352471 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.351549    2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 17:40:02.352471 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.351551    2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 17:40:02.352471 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.351554    2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 17:40:02.352471 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.351556    2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 17:40:02.352471 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.351560    2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 17:40:02.352471 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.351564    2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 17:40:02.352471 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.351566    2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 17:40:02.352471 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.351569    2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 17:40:02.352471 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.351572    2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 17:40:02.352471 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.351574    2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 17:40:02.352471 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.351577    2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 17:40:02.352471 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.351581    2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 17:40:02.352471 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.351585    2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 17:40:02.352471 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.351588    2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 17:40:02.352471 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.351591    2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 17:40:02.352471 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.351593    2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 17:40:02.352951 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.351596    2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 17:40:02.352951 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.351600    2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 17:40:02.352951 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.351602    2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 17:40:02.352951 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.351605    2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 17:40:02.352951 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.351607    2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 17:40:02.352951 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.351610    2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 17:40:02.352951 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.351612    2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 17:40:02.352951 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.351615    2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 17:40:02.352951 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.351618    2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 17:40:02.352951 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.351620    2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 17:40:02.352951 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.351623    2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 17:40:02.352951 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.351626    2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 17:40:02.352951 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.351628    2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 17:40:02.352951 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.351631    2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 17:40:02.352951 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.351634    2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 17:40:02.352951 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.351636    2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 17:40:02.352951 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.351639    2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 17:40:02.352951 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.351641    2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 17:40:02.352951 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.351644    2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 17:40:02.352951 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.351646    2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 17:40:02.353431 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.351649    2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 17:40:02.353431 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.351651    2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 17:40:02.353431 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.351654    2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 17:40:02.353431 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.351656    2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 17:40:02.353431 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.351659    2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 17:40:02.353431 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.351662    2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 17:40:02.353431 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.352129    2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 17:40:02.353431 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.352137    2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 17:40:02.353431 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.352140    2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 17:40:02.353431 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.352143    2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 17:40:02.353431 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.352146    2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 17:40:02.353431 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.352149    2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 17:40:02.353431 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.352152    2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 17:40:02.353431 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.352154    2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 17:40:02.353431 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.352157    2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 17:40:02.353431 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.352160    2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 17:40:02.353431 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.352163    2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 17:40:02.353431 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.352165    2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 17:40:02.353431 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.352168    2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 17:40:02.353896 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.352171    2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 17:40:02.353896 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.352174    2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 17:40:02.353896 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.352177    2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 17:40:02.353896 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.352179    2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 17:40:02.353896 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.352181    2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 17:40:02.353896 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.352184    2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 17:40:02.353896 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.352187    2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 17:40:02.353896 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.352190    2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 17:40:02.353896 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.352192    2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 17:40:02.353896 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.352195    2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 17:40:02.353896 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.352197    2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 17:40:02.353896 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.352200    2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 17:40:02.353896 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.352203    2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 17:40:02.353896 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.352205    2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 17:40:02.353896 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.352208    2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 17:40:02.353896 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.352210    2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 17:40:02.353896 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.352213    2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 17:40:02.353896 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.352215    2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 17:40:02.353896 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.352218    2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 17:40:02.353896 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.352221    2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 17:40:02.354403 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.352224    2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 17:40:02.354403 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.352226    2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 17:40:02.354403 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.352229    2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 17:40:02.354403 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.352232    2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 17:40:02.354403 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.352235    2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 17:40:02.354403 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.352237    2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 17:40:02.354403 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.352240    2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 17:40:02.354403 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.352243    2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 17:40:02.354403 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.352245    2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 17:40:02.354403 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.352248    2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 17:40:02.354403 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.352250    2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 17:40:02.354403 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.352253    2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 17:40:02.354403 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.352256    2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 17:40:02.354403 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.352258    2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 17:40:02.354403 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.352260    2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 17:40:02.354403 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.352263    2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 17:40:02.354403 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.352265    2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 17:40:02.354403 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.352268    2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 17:40:02.354403 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.352271    2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 17:40:02.354403 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.352273    2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 17:40:02.354978 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.352276    2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 17:40:02.354978 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.352278    2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 17:40:02.354978 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.352281    2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 17:40:02.354978 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.352284    2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 17:40:02.354978 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.352286    2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 17:40:02.354978 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.352289    2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 17:40:02.354978 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.352291    2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 17:40:02.354978 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.352294    2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 17:40:02.354978 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.352296    2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 17:40:02.354978 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.352299    2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 17:40:02.354978 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.352302    2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 17:40:02.354978 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.352304    2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 17:40:02.354978 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.352307    2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 17:40:02.354978 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.352310    2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 17:40:02.354978 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.352312    2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 17:40:02.354978 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.352315    2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 17:40:02.354978 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.352317    2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 17:40:02.354978 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.352321    2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 17:40:02.354978 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.352323    2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 17:40:02.354978 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.352328    2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 17:40:02.355500 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.352331    2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 17:40:02.355500 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.352334    2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 17:40:02.355500 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.352336    2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 17:40:02.355500 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.352339    2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 17:40:02.355500 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.352341    2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 17:40:02.355500 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.352344    2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 17:40:02.355500 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.352347    2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 17:40:02.355500 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.352351    2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 17:40:02.355500 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.352354    2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 17:40:02.355500 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.352357    2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 17:40:02.355500 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.352360    2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 17:40:02.355500 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.352363    2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 17:40:02.355500 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.352365    2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 17:40:02.355500 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352448    2576 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 17:40:02.355500 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352455    2576 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 17:40:02.355500 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352464    2576 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 17:40:02.355500 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352470    2576 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 17:40:02.355500 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352476    2576 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 17:40:02.355500 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352481    2576 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 17:40:02.355500 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352487    2576 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 17:40:02.355500 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352491    2576 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 17:40:02.356035 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352495    2576 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 17:40:02.356035 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352499    2576 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 17:40:02.356035 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352503    2576 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 17:40:02.356035 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352507    2576 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 17:40:02.356035 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352511    2576 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 17:40:02.356035 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352514    2576 flags.go:64] FLAG: --cgroup-root=""
Apr 16 17:40:02.356035 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352517    2576 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 17:40:02.356035 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352520    2576 flags.go:64] FLAG: --client-ca-file=""
Apr 16 17:40:02.356035 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352523    2576 flags.go:64] FLAG: --cloud-config=""
Apr 16 17:40:02.356035 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352526    2576 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 17:40:02.356035 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352529    2576 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 17:40:02.356035 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352534    2576 flags.go:64] FLAG: --cluster-domain=""
Apr 16 17:40:02.356035 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352537    2576 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 17:40:02.356035 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352540    2576 flags.go:64] FLAG: --config-dir=""
Apr 16 17:40:02.356035 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352543    2576 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 17:40:02.356035 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352547    2576 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 17:40:02.356035 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352551    2576 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 17:40:02.356035 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352555    2576 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 17:40:02.356035 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352558    2576 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 17:40:02.356035 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352562    2576 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 17:40:02.356035 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352565    2576 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 17:40:02.356035 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352568    2576 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 17:40:02.356035 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352571    2576 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 17:40:02.356035 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352575    2576 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 17:40:02.356035 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352578    2576 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 17:40:02.356645 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352582    2576 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 17:40:02.356645 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352586    2576 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 17:40:02.356645 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352589    2576 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 17:40:02.356645 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352591    2576 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 17:40:02.356645 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352594    2576 flags.go:64] FLAG: --enable-server="true"
Apr 16 17:40:02.356645 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352598    2576 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 17:40:02.356645 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352602    2576 flags.go:64] FLAG: --event-burst="100"
Apr 16 17:40:02.356645 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352605    2576 flags.go:64] FLAG: --event-qps="50"
Apr 16 17:40:02.356645 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352609    2576 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 17:40:02.356645 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352612    2576 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 17:40:02.356645 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352618    2576 flags.go:64] FLAG: --eviction-hard=""
Apr 16 17:40:02.356645 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352622    2576 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 17:40:02.356645 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352625    2576 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 17:40:02.356645 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352628    2576 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 17:40:02.356645 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352631    2576 flags.go:64] FLAG: --eviction-soft=""
Apr 16 17:40:02.356645 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352635    2576 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 17:40:02.356645 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352637    2576 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 17:40:02.356645 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352640    2576 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 17:40:02.356645 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352643    2576 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 17:40:02.356645
ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352647 2576 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 16 17:40:02.356645 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352650 2576 flags.go:64] FLAG: --fail-swap-on="true" Apr 16 17:40:02.356645 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352652 2576 flags.go:64] FLAG: --feature-gates="" Apr 16 17:40:02.356645 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352656 2576 flags.go:64] FLAG: --file-check-frequency="20s" Apr 16 17:40:02.356645 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352659 2576 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 16 17:40:02.356645 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352663 2576 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 16 17:40:02.357258 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352666 2576 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 16 17:40:02.357258 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352670 2576 flags.go:64] FLAG: --healthz-port="10248" Apr 16 17:40:02.357258 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352673 2576 flags.go:64] FLAG: --help="false" Apr 16 17:40:02.357258 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352676 2576 flags.go:64] FLAG: --hostname-override="ip-10-0-141-32.ec2.internal" Apr 16 17:40:02.357258 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352679 2576 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 17:40:02.357258 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352681 2576 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 17:40:02.357258 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352684 2576 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 17:40:02.357258 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352688 2576 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 
17:40:02.357258 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352691 2576 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 17:40:02.357258 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352694 2576 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 17:40:02.357258 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352697 2576 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 17:40:02.357258 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352699 2576 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 17:40:02.357258 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352702 2576 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 17:40:02.357258 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352705 2576 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 17:40:02.357258 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352709 2576 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 17:40:02.357258 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352712 2576 flags.go:64] FLAG: --kube-reserved="" Apr 16 17:40:02.357258 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352716 2576 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 17:40:02.357258 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352720 2576 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 17:40:02.357258 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352723 2576 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 17:40:02.357258 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352727 2576 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 17:40:02.357258 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352730 2576 flags.go:64] FLAG: --lock-file="" Apr 16 17:40:02.357258 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352732 2576 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 17:40:02.357258 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352735 2576 flags.go:64] FLAG: 
--log-flush-frequency="5s" Apr 16 17:40:02.357258 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352738 2576 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 17:40:02.357849 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352744 2576 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 17:40:02.357849 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352747 2576 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 17:40:02.357849 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352750 2576 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 17:40:02.357849 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352753 2576 flags.go:64] FLAG: --logging-format="text" Apr 16 17:40:02.357849 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352756 2576 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 17:40:02.357849 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352759 2576 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 17:40:02.357849 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352762 2576 flags.go:64] FLAG: --manifest-url="" Apr 16 17:40:02.357849 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352765 2576 flags.go:64] FLAG: --manifest-url-header="" Apr 16 17:40:02.357849 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352769 2576 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 17:40:02.357849 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352774 2576 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 17:40:02.357849 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352778 2576 flags.go:64] FLAG: --max-pods="110" Apr 16 17:40:02.357849 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352781 2576 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 17:40:02.357849 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352784 2576 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 17:40:02.357849 ip-10-0-141-32 kubenswrapper[2576]: 
I0416 17:40:02.352787 2576 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 17:40:02.357849 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352790 2576 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 17:40:02.357849 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352793 2576 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 17:40:02.357849 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352796 2576 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 17:40:02.357849 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352799 2576 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 17:40:02.357849 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352807 2576 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 17:40:02.357849 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352810 2576 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 17:40:02.357849 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352813 2576 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 17:40:02.357849 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352816 2576 flags.go:64] FLAG: --pod-cidr="" Apr 16 17:40:02.357849 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352820 2576 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc76bab72f320de3d4105c90d73c4fb139c09e20ce0fa8dcbc0cb59920d27dec" Apr 16 17:40:02.358404 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352827 2576 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 17:40:02.358404 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352830 2576 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 17:40:02.358404 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352834 2576 flags.go:64] FLAG: --pods-per-core="0" Apr 16 17:40:02.358404 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352837 2576 flags.go:64] FLAG: --port="10250" Apr 16 17:40:02.358404 ip-10-0-141-32 
kubenswrapper[2576]: I0416 17:40:02.352841 2576 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 17:40:02.358404 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352844 2576 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0e5719540b664232c" Apr 16 17:40:02.358404 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352847 2576 flags.go:64] FLAG: --qos-reserved="" Apr 16 17:40:02.358404 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352850 2576 flags.go:64] FLAG: --read-only-port="10255" Apr 16 17:40:02.358404 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352853 2576 flags.go:64] FLAG: --register-node="true" Apr 16 17:40:02.358404 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352856 2576 flags.go:64] FLAG: --register-schedulable="true" Apr 16 17:40:02.358404 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352859 2576 flags.go:64] FLAG: --register-with-taints="" Apr 16 17:40:02.358404 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352863 2576 flags.go:64] FLAG: --registry-burst="10" Apr 16 17:40:02.358404 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352866 2576 flags.go:64] FLAG: --registry-qps="5" Apr 16 17:40:02.358404 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352868 2576 flags.go:64] FLAG: --reserved-cpus="" Apr 16 17:40:02.358404 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352871 2576 flags.go:64] FLAG: --reserved-memory="" Apr 16 17:40:02.358404 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352875 2576 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 17:40:02.358404 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352878 2576 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 17:40:02.358404 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352881 2576 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 17:40:02.358404 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352884 2576 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 17:40:02.358404 ip-10-0-141-32 
kubenswrapper[2576]: I0416 17:40:02.352890 2576 flags.go:64] FLAG: --runonce="false" Apr 16 17:40:02.358404 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352894 2576 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 17:40:02.358404 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352897 2576 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 17:40:02.358404 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352900 2576 flags.go:64] FLAG: --seccomp-default="false" Apr 16 17:40:02.358404 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352902 2576 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 17:40:02.358404 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352917 2576 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 17:40:02.358404 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352921 2576 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 17:40:02.359045 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352924 2576 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 17:40:02.359045 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352927 2576 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 17:40:02.359045 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352930 2576 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 17:40:02.359045 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352933 2576 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 17:40:02.359045 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352936 2576 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 17:40:02.359045 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352939 2576 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 17:40:02.359045 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352942 2576 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 17:40:02.359045 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352946 2576 flags.go:64] FLAG: 
--system-cgroups="" Apr 16 17:40:02.359045 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352954 2576 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 17:40:02.359045 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352960 2576 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 17:40:02.359045 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352963 2576 flags.go:64] FLAG: --tls-cert-file="" Apr 16 17:40:02.359045 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352967 2576 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 17:40:02.359045 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352971 2576 flags.go:64] FLAG: --tls-min-version="" Apr 16 17:40:02.359045 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352974 2576 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 17:40:02.359045 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352977 2576 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 17:40:02.359045 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352980 2576 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 17:40:02.359045 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352983 2576 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 17:40:02.359045 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352986 2576 flags.go:64] FLAG: --v="2" Apr 16 17:40:02.359045 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352991 2576 flags.go:64] FLAG: --version="false" Apr 16 17:40:02.359045 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352995 2576 flags.go:64] FLAG: --vmodule="" Apr 16 17:40:02.359045 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.352999 2576 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 17:40:02.359045 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.353002 2576 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 17:40:02.359045 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.353104 
2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 17:40:02.359045 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.353108 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 17:40:02.359620 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.353111 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 17:40:02.359620 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.353114 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 17:40:02.359620 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.353117 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 17:40:02.359620 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.353120 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 17:40:02.359620 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.353122 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 17:40:02.359620 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.353125 2576 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 17:40:02.359620 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.353127 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 17:40:02.359620 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.353130 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 17:40:02.359620 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.353133 2576 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 17:40:02.359620 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.353135 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 17:40:02.359620 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.353137 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 17:40:02.359620 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.353140 
2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 17:40:02.359620 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.353143 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 17:40:02.359620 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.353146 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 17:40:02.359620 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.353152 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 17:40:02.359620 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.353154 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 17:40:02.359620 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.353158 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 17:40:02.359620 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.353161 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 17:40:02.359620 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.353163 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 17:40:02.359620 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.353166 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 17:40:02.360152 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.353169 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 17:40:02.360152 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.353173 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 17:40:02.360152 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.353177 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 17:40:02.360152 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.353181 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 17:40:02.360152 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.353185 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 17:40:02.360152 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.353188 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 17:40:02.360152 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.353192 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 17:40:02.360152 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.353195 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 17:40:02.360152 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.353197 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 17:40:02.360152 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.353200 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 17:40:02.360152 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.353203 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 17:40:02.360152 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.353206 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 17:40:02.360152 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.353209 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 17:40:02.360152 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.353211 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 17:40:02.360152 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.353214 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 17:40:02.360152 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.353217 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 17:40:02.360152 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.353219 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 17:40:02.360152 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.353222 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 17:40:02.360604 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.353225 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 17:40:02.360604 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.353227 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 17:40:02.360604 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.353230 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 17:40:02.360604 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.353232 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 17:40:02.360604 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.353235 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 17:40:02.360604 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.353237 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 17:40:02.360604 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.353240 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 17:40:02.360604 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.353243 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 17:40:02.360604 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.353246 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 17:40:02.360604 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.353249 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 17:40:02.360604 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.353253 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 17:40:02.360604 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.353255 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 17:40:02.360604 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.353258 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 17:40:02.360604 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.353261 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 17:40:02.360604 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.353263 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 17:40:02.360604 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.353266 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 17:40:02.360604 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.353268 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 17:40:02.360604 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.353271 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 17:40:02.360604 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.353273 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 17:40:02.360604 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.353276 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 17:40:02.361112 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.353278 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 17:40:02.361112 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.353281 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 17:40:02.361112 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.353283 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 17:40:02.361112 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.353286 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 17:40:02.361112 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.353289 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 17:40:02.361112 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.353291 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 17:40:02.361112 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.353293 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 17:40:02.361112 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.353296 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 17:40:02.361112 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.353299 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 17:40:02.361112 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.353301 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 17:40:02.361112 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.353304 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 17:40:02.361112 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.353307 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 17:40:02.361112 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.353309 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 17:40:02.361112 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.353312 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 17:40:02.361112 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.353315 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 17:40:02.361112 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.353317 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 17:40:02.361112 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.353320 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 17:40:02.361112 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.353322 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 17:40:02.361112 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.353325 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 17:40:02.361112 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.353328 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 17:40:02.361668 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.353332 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 17:40:02.361668 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.353335 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 17:40:02.361668 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.353338 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 17:40:02.361668 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.353341 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 17:40:02.361668 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.353344 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 17:40:02.361668 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.353346 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 17:40:02.361668 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.354335 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 17:40:02.362318 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.362295 2576 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 16 17:40:02.362357 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.362319 2576 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 16 17:40:02.362388 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362371 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 17:40:02.362388 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362378 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 17:40:02.362388 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362381 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 17:40:02.362388 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362385 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 17:40:02.362388 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362388 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 17:40:02.362388 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362391 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 17:40:02.362546 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362394 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 17:40:02.362546 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362397 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 17:40:02.362546 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362400 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 17:40:02.362546 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362403 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 17:40:02.362546 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362406 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 17:40:02.362546 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362409 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 17:40:02.362546 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362411 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 17:40:02.362546 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362414 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 17:40:02.362546 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362416 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 17:40:02.362546 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362419 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 17:40:02.362546 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362422 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 17:40:02.362546 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362424 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 17:40:02.362546 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362427 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 17:40:02.362546 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362429 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 17:40:02.362546 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362432 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 17:40:02.362546 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362434 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 17:40:02.362546 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362437 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 17:40:02.362546 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362440 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 17:40:02.362546 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362442 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 17:40:02.362546 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362445 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 17:40:02.363056 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362447 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 17:40:02.363056 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362450 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 17:40:02.363056 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362454 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 17:40:02.363056 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362459 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 17:40:02.363056 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362462 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 17:40:02.363056 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362466 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 17:40:02.363056 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362469 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 17:40:02.363056 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362471 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 17:40:02.363056 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362474 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 17:40:02.363056 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362476 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 17:40:02.363056 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362479 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 17:40:02.363056 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362482 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 17:40:02.363056 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362485 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 17:40:02.363056 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362488 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 17:40:02.363056 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362491 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 17:40:02.363056 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362493 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 17:40:02.363056 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362497 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 17:40:02.363056 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362501 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 17:40:02.363056 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362504 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 17:40:02.363522 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362507 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 17:40:02.363522 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362509 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 17:40:02.363522 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362512 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 17:40:02.363522 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362515 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 17:40:02.363522 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362518 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 17:40:02.363522 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362520 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 17:40:02.363522 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362523 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 17:40:02.363522 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362526 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 17:40:02.363522 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362528 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 17:40:02.363522 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362531 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 17:40:02.363522 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362534 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 17:40:02.363522 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362536 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 17:40:02.363522 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362539 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 17:40:02.363522 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362541 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 17:40:02.363522 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362544 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 17:40:02.363522 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362547 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 17:40:02.363522 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362549 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 17:40:02.363522 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362552 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 17:40:02.363522 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362555 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 17:40:02.363522 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362559 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 17:40:02.364038 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362562 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 17:40:02.364038 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362565 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 17:40:02.364038 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362567 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 17:40:02.364038 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362570 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 17:40:02.364038 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362573 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 17:40:02.364038 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362576 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 17:40:02.364038 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362578 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 17:40:02.364038 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362581 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 17:40:02.364038 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362583 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 17:40:02.364038 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362586 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 17:40:02.364038 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362589 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 17:40:02.364038 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362592 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 17:40:02.364038 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362595 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 17:40:02.364038 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362597 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 17:40:02.364038 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362600 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 17:40:02.364038 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362603 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 17:40:02.364038 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362605 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 17:40:02.364038 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362608 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 17:40:02.364038 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362611 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 17:40:02.364038 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362613 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 17:40:02.364530 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362616 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 17:40:02.364530 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.362621 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 17:40:02.364530 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362726 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 17:40:02.364530 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362731 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 17:40:02.364530 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362734 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 17:40:02.364530 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362738 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 17:40:02.364530 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362741 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 17:40:02.364530 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362744 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 17:40:02.364530 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362747 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 17:40:02.364530 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362750 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 17:40:02.364530 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362754 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 17:40:02.364530 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362757 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 17:40:02.364530 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362760 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 17:40:02.364530 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362763 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 17:40:02.364530 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362766 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 17:40:02.365000 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362769 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 17:40:02.365000 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362772 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 17:40:02.365000 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362775 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 17:40:02.365000 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362778 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 17:40:02.365000 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362781 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 17:40:02.365000 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362783 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 17:40:02.365000 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362786 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 17:40:02.365000 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362789 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 17:40:02.365000 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362791 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 17:40:02.365000 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362794 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 17:40:02.365000 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362796 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 17:40:02.365000 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362799 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 17:40:02.365000 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362802 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 17:40:02.365000 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362804 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 17:40:02.365000 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362807 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 17:40:02.365000 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362810 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 17:40:02.365000 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362813 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 17:40:02.365000 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362815 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 17:40:02.365000 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362818 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 17:40:02.365000 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362820 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 17:40:02.365491 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362823 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 17:40:02.365491 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362826 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 17:40:02.365491 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362828 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 17:40:02.365491 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362831 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 17:40:02.365491 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362834 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 17:40:02.365491 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362836 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 17:40:02.365491 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362839 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 17:40:02.365491 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362841 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 17:40:02.365491 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362844 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 17:40:02.365491 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362846 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 17:40:02.365491 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362849 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 17:40:02.365491 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362852 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 17:40:02.365491 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362855 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 17:40:02.365491 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362857 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 17:40:02.365491 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362860 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 17:40:02.365491 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362863 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 17:40:02.365491 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362865 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 17:40:02.365491 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362869 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 17:40:02.365491 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362872 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 17:40:02.365491 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362874 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 17:40:02.366000 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362877 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 17:40:02.366000 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362880 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 17:40:02.366000 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362882 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 17:40:02.366000 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362885 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 17:40:02.366000 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362887 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 17:40:02.366000 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362890 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 17:40:02.366000 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362892 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 17:40:02.366000 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362895 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 17:40:02.366000 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362897 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 17:40:02.366000 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362900 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 17:40:02.366000 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362903 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 17:40:02.366000 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362922 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 17:40:02.366000 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362926 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 17:40:02.366000 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362928 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 17:40:02.366000 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362931 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 17:40:02.366000 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362934 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 17:40:02.366000 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362937 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 17:40:02.366000 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362940 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 17:40:02.366000 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362942 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 17:40:02.366509 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362945 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 17:40:02.366509 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362948 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 17:40:02.366509 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362950 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 17:40:02.366509 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362953 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 17:40:02.366509 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362957 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 17:40:02.366509 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362959 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 17:40:02.366509 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362962 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 17:40:02.366509 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362966 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 17:40:02.366509 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362970 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 17:40:02.366509 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362973 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 17:40:02.366509 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362976 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 17:40:02.366509 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362978 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 17:40:02.366509 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362981 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 17:40:02.366509 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:02.362984 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 17:40:02.366509 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.362989 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 17:40:02.366897 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.363918 2576 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 16 17:40:02.366897 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.366298 2576 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 16 17:40:02.367459 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.367445 2576 server.go:1019] "Starting client certificate rotation"
Apr 16 17:40:02.367561 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.367543 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 17:40:02.368434 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.368422 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 17:40:02.396524 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.396498 2576 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 17:40:02.403920 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.403869 2576 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 17:40:02.421432 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.421402 2576 log.go:25] "Validated CRI v1 runtime API"
Apr 16 17:40:02.427823 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.427801 2576 log.go:25] "Validated CRI v1 image API"
Apr 16 17:40:02.431534 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.431515 2576 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 16 17:40:02.432221 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.432204 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 17:40:02.437009 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.436981 2576 fs.go:135] Filesystem UUIDs: map[1fc30240-f6f0-470c-92d9-7ef17d491aa5:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 a39025e4-35fe-4d3e-9045-6e506d0b43f5:/dev/nvme0n1p3]
Apr 16 17:40:02.437833 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.437806 2576 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 16 17:40:02.443843 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.443708 2576 manager.go:217] Machine: {Timestamp:2026-04-16 17:40:02.442394932 +0000 UTC m=+0.458862443 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100907 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2335b42be5f9bebfcbdbd0a32fb7a5 SystemUUID:ec2335b4-2be5-f9be-bfcb-dbd0a32fb7a5 BootID:1417a3cd-efe8-4c28-8606-2c0f27ee2211 Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6098944 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:e2:ba:25:fd:e3 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:e2:ba:25:fd:e3 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:ee:5c:2c:6f:b5:bf Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 17:40:02.444635 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.444622 2576 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 16 17:40:02.444754 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.444737 2576 manager.go:233] Version: {KernelVersion:5.14.0-570.104.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260401-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 16 17:40:02.446101 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.446076 2576 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 16 17:40:02.446283 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.446102 2576 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-141-32.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"con
tainer","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 16 17:40:02.446361 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.446297 2576 topology_manager.go:138] "Creating topology manager with none policy" Apr 16 17:40:02.446361 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.446310 2576 container_manager_linux.go:306] "Creating device plugin manager" Apr 16 17:40:02.446361 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.446329 2576 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 17:40:02.446361 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.446351 2576 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 17:40:02.448262 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.448248 2576 state_mem.go:36] "Initialized new in-memory state store" Apr 16 17:40:02.448393 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.448382 2576 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 16 17:40:02.452074 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.452060 2576 kubelet.go:491] "Attempting to sync node with API server" Apr 16 17:40:02.452139 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.452089 2576 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 16 17:40:02.452961 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.452950 2576 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 16 17:40:02.453009 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.452974 2576 kubelet.go:397] "Adding apiserver pod source" Apr 16 17:40:02.453009 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.452991 2576 apiserver.go:42] "Waiting for node sync before watching 
apiserver pods" Apr 16 17:40:02.453398 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.453377 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-zvhww" Apr 16 17:40:02.454337 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.454323 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 17:40:02.454399 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.454349 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 17:40:02.459837 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.459810 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-zvhww" Apr 16 17:40:02.460781 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.460757 2576 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 16 17:40:02.462440 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:02.462395 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 16 17:40:02.462562 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:02.462512 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-141-32.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 16 17:40:02.463674 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.463657 2576 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode 
or the ClusterTrustBundleProjection featuregate is disabled" Apr 16 17:40:02.465521 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.465506 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 16 17:40:02.465578 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.465525 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 16 17:40:02.465578 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.465532 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 16 17:40:02.465578 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.465537 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 16 17:40:02.465578 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.465543 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 16 17:40:02.465578 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.465549 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 16 17:40:02.465578 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.465555 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 16 17:40:02.465578 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.465561 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 16 17:40:02.465578 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.465568 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 16 17:40:02.465578 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.465574 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 16 17:40:02.465578 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.465583 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 16 17:40:02.465864 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.465592 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 
16 17:40:02.467612 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.467598 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 16 17:40:02.467656 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.467614 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 16 17:40:02.471341 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.471325 2576 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 16 17:40:02.471439 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.471366 2576 server.go:1295] "Started kubelet" Apr 16 17:40:02.471475 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.471437 2576 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 16 17:40:02.472182 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.472123 2576 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 16 17:40:02.472269 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.472209 2576 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 16 17:40:02.472187 ip-10-0-141-32 systemd[1]: Started Kubernetes Kubelet. 
Apr 16 17:40:02.473627 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.473610 2576 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 16 17:40:02.474221 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.474209 2576 server.go:317] "Adding debug handlers to kubelet server" Apr 16 17:40:02.478112 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.478097 2576 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-141-32.ec2.internal" not found Apr 16 17:40:02.480812 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.480795 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 16 17:40:02.481469 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.481449 2576 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 16 17:40:02.481574 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:02.481499 2576 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 16 17:40:02.482166 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.482149 2576 factory.go:55] Registering systemd factory Apr 16 17:40:02.482166 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.482166 2576 factory.go:223] Registration of the systemd container factory successfully Apr 16 17:40:02.482280 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.482273 2576 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 16 17:40:02.482315 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.482287 2576 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 16 17:40:02.482352 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.482272 2576 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 16 17:40:02.482417 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.482405 2576 factory.go:153] Registering CRI-O factory Apr 16 17:40:02.482450 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.482415 2576 reconstruct.go:97] "Volume reconstruction finished" Apr 16 17:40:02.482450 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.482422 2576 factory.go:223] Registration of the crio container factory successfully Apr 16 17:40:02.482450 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.482427 2576 reconciler.go:26] "Reconciler: start to sync state" Apr 16 17:40:02.482571 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.482502 2576 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 16 17:40:02.482571 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:02.482511 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-32.ec2.internal\" not found" Apr 16 17:40:02.482571 
ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.482530 2576 factory.go:103] Registering Raw factory Apr 16 17:40:02.482686 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.482578 2576 manager.go:1196] Started watching for new ooms in manager Apr 16 17:40:02.483280 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.483263 2576 manager.go:319] Starting recovery of all containers Apr 16 17:40:02.484035 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.484019 2576 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 17:40:02.488030 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:02.488005 2576 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-141-32.ec2.internal\" not found" node="ip-10-0-141-32.ec2.internal" Apr 16 17:40:02.495346 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.495322 2576 manager.go:324] Recovery completed Apr 16 17:40:02.495459 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.495374 2576 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-141-32.ec2.internal" not found Apr 16 17:40:02.498177 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:02.498151 2576 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": inotify_add_watch /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no such file or directory Apr 16 17:40:02.501063 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.501050 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 17:40:02.503721 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.503704 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-32.ec2.internal" event="NodeHasSufficientMemory" Apr 16 17:40:02.503797 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.503737 2576 kubelet_node_status.go:736] "Recording event message for node" 
node="ip-10-0-141-32.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 17:40:02.503797 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.503749 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-32.ec2.internal" event="NodeHasSufficientPID" Apr 16 17:40:02.504271 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.504252 2576 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 16 17:40:02.504271 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.504267 2576 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 16 17:40:02.504398 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.504287 2576 state_mem.go:36] "Initialized new in-memory state store" Apr 16 17:40:02.508196 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.508183 2576 policy_none.go:49] "None policy: Start" Apr 16 17:40:02.508249 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.508201 2576 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 16 17:40:02.508249 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.508211 2576 state_mem.go:35] "Initializing new in-memory state store" Apr 16 17:40:02.572024 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.553517 2576 manager.go:341] "Starting Device Plugin manager" Apr 16 17:40:02.572024 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:02.553564 2576 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 16 17:40:02.572024 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.553579 2576 server.go:85] "Starting device plugin registration server" Apr 16 17:40:02.572024 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.553880 2576 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 16 17:40:02.572024 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.553895 2576 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 16 17:40:02.572024 ip-10-0-141-32 
kubenswrapper[2576]: I0416 17:40:02.554034 2576 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 16 17:40:02.572024 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.554128 2576 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 16 17:40:02.572024 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.554135 2576 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 16 17:40:02.572024 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:02.554626 2576 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 16 17:40:02.572024 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:02.554660 2576 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-141-32.ec2.internal\" not found" Apr 16 17:40:02.572024 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.559354 2576 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-141-32.ec2.internal" not found Apr 16 17:40:02.620734 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.620645 2576 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 16 17:40:02.621965 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.621947 2576 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 16 17:40:02.622040 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.621976 2576 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 16 17:40:02.622040 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.621996 2576 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 16 17:40:02.622040 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.622003 2576 kubelet.go:2451] "Starting kubelet main sync loop" Apr 16 17:40:02.622040 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:02.622037 2576 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 16 17:40:02.629610 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.629586 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 17:40:02.654893 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.654866 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 17:40:02.655873 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.655857 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-32.ec2.internal" event="NodeHasSufficientMemory" Apr 16 17:40:02.655957 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.655889 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-32.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 17:40:02.655957 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.655899 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-32.ec2.internal" event="NodeHasSufficientPID" Apr 16 17:40:02.655957 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.655941 2576 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-141-32.ec2.internal" Apr 16 17:40:02.664763 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.664743 2576 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-141-32.ec2.internal" Apr 16 17:40:02.664819 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:02.664772 2576 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-141-32.ec2.internal\": node \"ip-10-0-141-32.ec2.internal\" not found" Apr 16 17:40:02.679735 
ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:02.679704 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-32.ec2.internal\" not found" Apr 16 17:40:02.722802 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.722771 2576 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-32.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-141-32.ec2.internal"] Apr 16 17:40:02.722877 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.722853 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 17:40:02.724371 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.724354 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-32.ec2.internal" event="NodeHasSufficientMemory" Apr 16 17:40:02.724425 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.724388 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-32.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 17:40:02.724425 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.724402 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-32.ec2.internal" event="NodeHasSufficientPID" Apr 16 17:40:02.726682 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.726667 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 17:40:02.726836 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.726822 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-32.ec2.internal" Apr 16 17:40:02.726882 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.726851 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 17:40:02.727406 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.727388 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-32.ec2.internal" event="NodeHasSufficientMemory" Apr 16 17:40:02.727473 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.727420 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-32.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 17:40:02.727473 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.727430 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-32.ec2.internal" event="NodeHasSufficientPID" Apr 16 17:40:02.727473 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.727391 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-32.ec2.internal" event="NodeHasSufficientMemory" Apr 16 17:40:02.727564 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.727491 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-32.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 17:40:02.727564 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.727504 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-32.ec2.internal" event="NodeHasSufficientPID" Apr 16 17:40:02.730080 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.730062 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-32.ec2.internal" Apr 16 17:40:02.730148 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.730101 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 17:40:02.730899 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.730877 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-32.ec2.internal" event="NodeHasSufficientMemory" Apr 16 17:40:02.730988 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.730921 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-32.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 17:40:02.730988 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.730942 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-32.ec2.internal" event="NodeHasSufficientPID" Apr 16 17:40:02.752624 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:02.752604 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-141-32.ec2.internal\" not found" node="ip-10-0-141-32.ec2.internal" Apr 16 17:40:02.757072 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:02.757053 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-141-32.ec2.internal\" not found" node="ip-10-0-141-32.ec2.internal" Apr 16 17:40:02.780338 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:02.780307 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-32.ec2.internal\" not found" Apr 16 17:40:02.784178 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.784155 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/71ae8018aa9dd49bcac3f08161afc28d-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-141-32.ec2.internal\" (UID: \"71ae8018aa9dd49bcac3f08161afc28d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-32.ec2.internal" Apr 16 17:40:02.784235 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.784191 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/71ae8018aa9dd49bcac3f08161afc28d-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-32.ec2.internal\" (UID: \"71ae8018aa9dd49bcac3f08161afc28d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-32.ec2.internal" Apr 16 17:40:02.784235 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.784209 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f34ad151e5f273367866fd58bbc327be-config\") pod \"kube-apiserver-proxy-ip-10-0-141-32.ec2.internal\" (UID: \"f34ad151e5f273367866fd58bbc327be\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-32.ec2.internal" Apr 16 17:40:02.881395 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:02.881311 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-32.ec2.internal\" not found" Apr 16 17:40:02.884603 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.884581 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f34ad151e5f273367866fd58bbc327be-config\") pod \"kube-apiserver-proxy-ip-10-0-141-32.ec2.internal\" (UID: \"f34ad151e5f273367866fd58bbc327be\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-32.ec2.internal" Apr 16 17:40:02.884720 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.884702 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/71ae8018aa9dd49bcac3f08161afc28d-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-141-32.ec2.internal\" (UID: \"71ae8018aa9dd49bcac3f08161afc28d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-32.ec2.internal" Apr 16 17:40:02.884780 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.884740 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/71ae8018aa9dd49bcac3f08161afc28d-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-32.ec2.internal\" (UID: \"71ae8018aa9dd49bcac3f08161afc28d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-32.ec2.internal" Apr 16 17:40:02.884780 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.884670 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f34ad151e5f273367866fd58bbc327be-config\") pod \"kube-apiserver-proxy-ip-10-0-141-32.ec2.internal\" (UID: \"f34ad151e5f273367866fd58bbc327be\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-32.ec2.internal" Apr 16 17:40:02.884867 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.884781 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/71ae8018aa9dd49bcac3f08161afc28d-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-32.ec2.internal\" (UID: \"71ae8018aa9dd49bcac3f08161afc28d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-32.ec2.internal" Apr 16 17:40:02.884867 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:02.884787 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/71ae8018aa9dd49bcac3f08161afc28d-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-141-32.ec2.internal\" (UID: \"71ae8018aa9dd49bcac3f08161afc28d\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-32.ec2.internal"
Apr 16 17:40:02.982072 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:02.982045 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-32.ec2.internal\" not found"
Apr 16 17:40:03.055502 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:03.055458 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-32.ec2.internal"
Apr 16 17:40:03.059511 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:03.059482 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-32.ec2.internal"
Apr 16 17:40:03.083151 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:03.083118 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-32.ec2.internal\" not found"
Apr 16 17:40:03.183688 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:03.183592 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-32.ec2.internal\" not found"
Apr 16 17:40:03.284137 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:03.284109 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-32.ec2.internal\" not found"
Apr 16 17:40:03.367593 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:03.367557 2576 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 16 17:40:03.368158 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:03.367714 2576 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 17:40:03.368158 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:03.367751 2576 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 17:40:03.384939 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:03.384888 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-32.ec2.internal\" not found"
Apr 16 17:40:03.457286 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:03.457207 2576 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 17:40:03.462499 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:03.462457 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 17:35:02 +0000 UTC" deadline="2027-09-15 07:48:51.825910434 +0000 UTC"
Apr 16 17:40:03.462499 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:03.462498 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12398h8m48.363415966s"
Apr 16 17:40:03.481668 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:03.481642 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 16 17:40:03.485870 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:03.485849 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-32.ec2.internal\" not found"
Apr 16 17:40:03.496181 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:03.496161 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 17:40:03.517283 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:03.517254 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-m4894"
Apr 16 17:40:03.524517 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:03.524492 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-m4894"
Apr 16 17:40:03.586305 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:03.586254 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-32.ec2.internal\" not found"
Apr 16 17:40:03.599255 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:03.599231 2576 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 17:40:03.654349 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:03.654309 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71ae8018aa9dd49bcac3f08161afc28d.slice/crio-8b578c3b83d34b5f4ebdbfd1d130b08608f7ebe4c7bc3f28eb0c86da563772e4 WatchSource:0}: Error finding container 8b578c3b83d34b5f4ebdbfd1d130b08608f7ebe4c7bc3f28eb0c86da563772e4: Status 404 returned error can't find the container with id 8b578c3b83d34b5f4ebdbfd1d130b08608f7ebe4c7bc3f28eb0c86da563772e4
Apr 16 17:40:03.654840 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:03.654816 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf34ad151e5f273367866fd58bbc327be.slice/crio-d307d20bc85e5caf4b2877905db8242e988e148905ef1a1209c8cb2688bc6d20 WatchSource:0}: Error finding container d307d20bc85e5caf4b2877905db8242e988e148905ef1a1209c8cb2688bc6d20: Status 404 returned error can't find the container with id d307d20bc85e5caf4b2877905db8242e988e148905ef1a1209c8cb2688bc6d20
Apr 16 17:40:03.658736 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:03.658724 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 17:40:03.682225 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:03.682189 2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-32.ec2.internal"
Apr 16 17:40:03.696124 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:03.696093 2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 17:40:03.696982 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:03.696966 2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-32.ec2.internal"
Apr 16 17:40:03.706787 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:03.706768 2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 17:40:04.177071 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.177034 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 17:40:04.235244 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.235129 2576 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 17:40:04.453249 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.453215 2576 apiserver.go:52] "Watching apiserver"
Apr 16 17:40:04.460245 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.460218 2576 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 16 17:40:04.462204 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.462172 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/kube-apiserver-proxy-ip-10-0-141-32.ec2.internal","openshift-cluster-node-tuning-operator/tuned-ndgvz","openshift-image-registry/node-ca-cbg2b","openshift-multus/multus-8nf86","openshift-ovn-kubernetes/ovnkube-node-8kd57","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wckmp","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-32.ec2.internal","openshift-multus/multus-additional-cni-plugins-t42f2","openshift-multus/network-metrics-daemon-tw2xb","openshift-network-diagnostics/network-check-target-bs2mp","openshift-network-operator/iptables-alerter-28g5n","kube-system/konnectivity-agent-dbzll"]
Apr 16 17:40:04.467187 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.467158 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-ndgvz"
Apr 16 17:40:04.469456 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.469433 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-cbg2b"
Apr 16 17:40:04.469571 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.469512 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-8nf86"
Apr 16 17:40:04.470042 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.470022 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 16 17:40:04.470126 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.470027 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-5v4bz\""
Apr 16 17:40:04.470277 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.470256 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 16 17:40:04.472141 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.471869 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-8kd57"
Apr 16 17:40:04.472141 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.472130 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 16 17:40:04.472598 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.472484 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 16 17:40:04.472598 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.472560 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 16 17:40:04.472747 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.472675 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 16 17:40:04.472747 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.472714 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 16 17:40:04.472936 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.472902 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-q85p4\""
Apr 16 17:40:04.473405 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.473289 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 16 17:40:04.473405 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.473391 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-xlkxb\""
Apr 16 17:40:04.473520 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.473488 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 16 17:40:04.474495 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.474478 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 16 17:40:04.474824 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.474794 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wckmp"
Apr 16 17:40:04.475059 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.475019 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-tknzl\""
Apr 16 17:40:04.475361 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.475272 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 16 17:40:04.476036 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.476006 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 16 17:40:04.476277 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.476262 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 16 17:40:04.476277 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.476272 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 16 17:40:04.476680 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.476650 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 16 17:40:04.477324 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.477302 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 16 17:40:04.477415 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.477393 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 16 17:40:04.477485 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.477436 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 16 17:40:04.477589 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.477566 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-mnbq7\""
Apr 16 17:40:04.482466 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.482446 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-t42f2"
Apr 16 17:40:04.484929 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.484884 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-hgp2z\""
Apr 16 17:40:04.485030 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.484981 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tw2xb"
Apr 16 17:40:04.485092 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:04.485068 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tw2xb" podUID="98ed775b-36f2-475e-9c1b-e1e3a5261ed5"
Apr 16 17:40:04.485200 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.485179 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 16 17:40:04.485334 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.484892 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 16 17:40:04.487356 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.487324 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bs2mp"
Apr 16 17:40:04.487427 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:04.487397 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bs2mp" podUID="5a330027-b9ff-458e-aa78-e2eb5a0bda58"
Apr 16 17:40:04.490730 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.490710 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-dbzll"
Apr 16 17:40:04.490843 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.490745 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-28g5n"
Apr 16 17:40:04.493048 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.493022 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d10122cd-f300-4191-95af-3535482c3187-cnibin\") pod \"multus-8nf86\" (UID: \"d10122cd-f300-4191-95af-3535482c3187\") " pod="openshift-multus/multus-8nf86"
Apr 16 17:40:04.493157 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.493059 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d10122cd-f300-4191-95af-3535482c3187-multus-conf-dir\") pod \"multus-8nf86\" (UID: \"d10122cd-f300-4191-95af-3535482c3187\") " pod="openshift-multus/multus-8nf86"
Apr 16 17:40:04.493157 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.493078 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a3513544-fc2d-454a-86b1-8937a6fe9238-etc-openvswitch\") pod \"ovnkube-node-8kd57\" (UID: \"a3513544-fc2d-454a-86b1-8937a6fe9238\") " pod="openshift-ovn-kubernetes/ovnkube-node-8kd57"
Apr 16 17:40:04.493157 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.493096 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9af59130-dcb7-4d75-a828-c42cc1333d3d-lib-modules\") pod \"tuned-ndgvz\" (UID: \"9af59130-dcb7-4d75-a828-c42cc1333d3d\") " pod="openshift-cluster-node-tuning-operator/tuned-ndgvz"
Apr 16 17:40:04.493157 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.493141 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9af59130-dcb7-4d75-a828-c42cc1333d3d-var-lib-kubelet\") pod \"tuned-ndgvz\" (UID: \"9af59130-dcb7-4d75-a828-c42cc1333d3d\") " pod="openshift-cluster-node-tuning-operator/tuned-ndgvz"
Apr 16 17:40:04.493357 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.493176 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47qfx\" (UniqueName: \"kubernetes.io/projected/d10122cd-f300-4191-95af-3535482c3187-kube-api-access-47qfx\") pod \"multus-8nf86\" (UID: \"d10122cd-f300-4191-95af-3535482c3187\") " pod="openshift-multus/multus-8nf86"
Apr 16 17:40:04.493357 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.493219 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a3513544-fc2d-454a-86b1-8937a6fe9238-node-log\") pod \"ovnkube-node-8kd57\" (UID: \"a3513544-fc2d-454a-86b1-8937a6fe9238\") " pod="openshift-ovn-kubernetes/ovnkube-node-8kd57"
Apr 16 17:40:04.493357 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.493248 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/404aadc7-59c7-4274-841e-38902a95c670-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-t42f2\" (UID: \"404aadc7-59c7-4274-841e-38902a95c670\") " pod="openshift-multus/multus-additional-cni-plugins-t42f2"
Apr 16 17:40:04.493357 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.493191 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 16 17:40:04.493357 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.493273 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d10122cd-f300-4191-95af-3535482c3187-cni-binary-copy\") pod \"multus-8nf86\" (UID: \"d10122cd-f300-4191-95af-3535482c3187\") " pod="openshift-multus/multus-8nf86"
Apr 16 17:40:04.493357 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.493189 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-h9tm8\""
Apr 16 17:40:04.493357 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.493333 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/d10122cd-f300-4191-95af-3535482c3187-multus-socket-dir-parent\") pod \"multus-8nf86\" (UID: \"d10122cd-f300-4191-95af-3535482c3187\") " pod="openshift-multus/multus-8nf86"
Apr 16 17:40:04.493658 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.493368 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/d10122cd-f300-4191-95af-3535482c3187-host-run-k8s-cni-cncf-io\") pod \"multus-8nf86\" (UID: \"d10122cd-f300-4191-95af-3535482c3187\") " pod="openshift-multus/multus-8nf86"
Apr 16 17:40:04.493658 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.493375 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 16 17:40:04.493658 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.493376 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 16 17:40:04.493658 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.493395 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a3513544-fc2d-454a-86b1-8937a6fe9238-var-lib-openvswitch\") pod \"ovnkube-node-8kd57\" (UID: \"a3513544-fc2d-454a-86b1-8937a6fe9238\") " pod="openshift-ovn-kubernetes/ovnkube-node-8kd57"
Apr 16 17:40:04.493658 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.493426 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a3513544-fc2d-454a-86b1-8937a6fe9238-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8kd57\" (UID: \"a3513544-fc2d-454a-86b1-8937a6fe9238\") " pod="openshift-ovn-kubernetes/ovnkube-node-8kd57"
Apr 16 17:40:04.493658 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.493523 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/77172d03-834d-4c9b-8b9c-2d2f57a663cd-host\") pod \"node-ca-cbg2b\" (UID: \"77172d03-834d-4c9b-8b9c-2d2f57a663cd\") " pod="openshift-image-registry/node-ca-cbg2b"
Apr 16 17:40:04.493658 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.493545 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b377c13c-6a96-47ac-be6e-6c19afe80cea-socket-dir\") pod \"aws-ebs-csi-driver-node-wckmp\" (UID: \"b377c13c-6a96-47ac-be6e-6c19afe80cea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wckmp"
Apr 16 17:40:04.493658 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.493562 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a3513544-fc2d-454a-86b1-8937a6fe9238-host-cni-bin\") pod \"ovnkube-node-8kd57\" (UID: \"a3513544-fc2d-454a-86b1-8937a6fe9238\") " pod="openshift-ovn-kubernetes/ovnkube-node-8kd57"
Apr 16 17:40:04.493658 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.493606 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 16 17:40:04.493658 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.493617 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a3513544-fc2d-454a-86b1-8937a6fe9238-host-cni-netd\") pod \"ovnkube-node-8kd57\" (UID: \"a3513544-fc2d-454a-86b1-8937a6fe9238\") " pod="openshift-ovn-kubernetes/ovnkube-node-8kd57"
Apr 16 17:40:04.493658 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.493647 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/98ed775b-36f2-475e-9c1b-e1e3a5261ed5-metrics-certs\") pod \"network-metrics-daemon-tw2xb\" (UID: \"98ed775b-36f2-475e-9c1b-e1e3a5261ed5\") " pod="openshift-multus/network-metrics-daemon-tw2xb"
Apr 16 17:40:04.494172 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.493675 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-7srdm\""
Apr 16 17:40:04.494172 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.493675 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b377c13c-6a96-47ac-be6e-6c19afe80cea-kubelet-dir\") pod \"aws-ebs-csi-driver-node-wckmp\" (UID: \"b377c13c-6a96-47ac-be6e-6c19afe80cea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wckmp"
Apr 16 17:40:04.494172 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.493704 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 16 17:40:04.494172 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.493735 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d10122cd-f300-4191-95af-3535482c3187-os-release\") pod \"multus-8nf86\" (UID: \"d10122cd-f300-4191-95af-3535482c3187\") " pod="openshift-multus/multus-8nf86"
Apr 16 17:40:04.494172 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.493774 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d10122cd-f300-4191-95af-3535482c3187-multus-daemon-config\") pod \"multus-8nf86\" (UID: \"d10122cd-f300-4191-95af-3535482c3187\") " pod="openshift-multus/multus-8nf86"
Apr 16 17:40:04.494172 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.493798 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a3513544-fc2d-454a-86b1-8937a6fe9238-run-systemd\") pod \"ovnkube-node-8kd57\" (UID: \"a3513544-fc2d-454a-86b1-8937a6fe9238\") " pod="openshift-ovn-kubernetes/ovnkube-node-8kd57"
Apr 16 17:40:04.494172 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.493819 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/9af59130-dcb7-4d75-a828-c42cc1333d3d-etc-sysctl-d\") pod \"tuned-ndgvz\" (UID: \"9af59130-dcb7-4d75-a828-c42cc1333d3d\") " pod="openshift-cluster-node-tuning-operator/tuned-ndgvz"
Apr 16 17:40:04.494172 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.493866 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9af59130-dcb7-4d75-a828-c42cc1333d3d-run\") pod \"tuned-ndgvz\" (UID: \"9af59130-dcb7-4d75-a828-c42cc1333d3d\") " pod="openshift-cluster-node-tuning-operator/tuned-ndgvz"
Apr 16 17:40:04.494172 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.493899 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9af59130-dcb7-4d75-a828-c42cc1333d3d-tmp\") pod \"tuned-ndgvz\" (UID: \"9af59130-dcb7-4d75-a828-c42cc1333d3d\") " pod="openshift-cluster-node-tuning-operator/tuned-ndgvz"
Apr 16 17:40:04.494172 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.493987 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/b377c13c-6a96-47ac-be6e-6c19afe80cea-device-dir\") pod \"aws-ebs-csi-driver-node-wckmp\" (UID: \"b377c13c-6a96-47ac-be6e-6c19afe80cea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wckmp"
Apr 16 17:40:04.494172 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.494019 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a3513544-fc2d-454a-86b1-8937a6fe9238-host-run-netns\") pod \"ovnkube-node-8kd57\" (UID: \"a3513544-fc2d-454a-86b1-8937a6fe9238\") " pod="openshift-ovn-kubernetes/ovnkube-node-8kd57"
Apr 16 17:40:04.494172 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.494060 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a3513544-fc2d-454a-86b1-8937a6fe9238-run-openvswitch\") pod \"ovnkube-node-8kd57\" (UID: \"a3513544-fc2d-454a-86b1-8937a6fe9238\") " pod="openshift-ovn-kubernetes/ovnkube-node-8kd57"
Apr 16 17:40:04.494172 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.494085 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/9af59130-dcb7-4d75-a828-c42cc1333d3d-etc-systemd\") pod \"tuned-ndgvz\" (UID: \"9af59130-dcb7-4d75-a828-c42cc1333d3d\") " pod="openshift-cluster-node-tuning-operator/tuned-ndgvz"
Apr 16 17:40:04.494172 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.494134 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b377c13c-6a96-47ac-be6e-6c19afe80cea-registration-dir\") pod \"aws-ebs-csi-driver-node-wckmp\" (UID: \"b377c13c-6a96-47ac-be6e-6c19afe80cea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wckmp"
Apr 16 17:40:04.494172 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.494164 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frpkx\" (UniqueName: \"kubernetes.io/projected/b377c13c-6a96-47ac-be6e-6c19afe80cea-kube-api-access-frpkx\") pod \"aws-ebs-csi-driver-node-wckmp\" (UID: \"b377c13c-6a96-47ac-be6e-6c19afe80cea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wckmp"
Apr 16 17:40:04.494898 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.494193 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/404aadc7-59c7-4274-841e-38902a95c670-tuning-conf-dir\") pod \"multus-additional-cni-plugins-t42f2\" (UID: \"404aadc7-59c7-4274-841e-38902a95c670\") " pod="openshift-multus/multus-additional-cni-plugins-t42f2"
Apr 16 17:40:04.494898 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.494219 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d10122cd-f300-4191-95af-3535482c3187-host-run-multus-certs\") pod \"multus-8nf86\" (UID: \"d10122cd-f300-4191-95af-3535482c3187\") " pod="openshift-multus/multus-8nf86"
Apr 16 17:40:04.494898 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.494246 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a3513544-fc2d-454a-86b1-8937a6fe9238-env-overrides\") pod \"ovnkube-node-8kd57\" (UID: \"a3513544-fc2d-454a-86b1-8937a6fe9238\") " pod="openshift-ovn-kubernetes/ovnkube-node-8kd57"
Apr 16 17:40:04.494898 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.494273 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6rr5\" (UniqueName: \"kubernetes.io/projected/5a330027-b9ff-458e-aa78-e2eb5a0bda58-kube-api-access-c6rr5\") pod \"network-check-target-bs2mp\" (UID: \"5a330027-b9ff-458e-aa78-e2eb5a0bda58\") " pod="openshift-network-diagnostics/network-check-target-bs2mp"
Apr 16 17:40:04.494898 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.494298 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/77172d03-834d-4c9b-8b9c-2d2f57a663cd-serviceca\") pod \"node-ca-cbg2b\" (UID: \"77172d03-834d-4c9b-8b9c-2d2f57a663cd\") " pod="openshift-image-registry/node-ca-cbg2b"
Apr 16 17:40:04.494898 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.494325 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d10122cd-f300-4191-95af-3535482c3187-etc-kubernetes\") pod \"multus-8nf86\" (UID: \"d10122cd-f300-4191-95af-3535482c3187\") " pod="openshift-multus/multus-8nf86"
Apr 16 17:40:04.494898 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.494342 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a3513544-fc2d-454a-86b1-8937a6fe9238-log-socket\") pod \"ovnkube-node-8kd57\" (UID: \"a3513544-fc2d-454a-86b1-8937a6fe9238\") " pod="openshift-ovn-kubernetes/ovnkube-node-8kd57"
Apr 16 17:40:04.494898 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.494365 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22bp7\" (UniqueName: \"kubernetes.io/projected/404aadc7-59c7-4274-841e-38902a95c670-kube-api-access-22bp7\") pod \"multus-additional-cni-plugins-t42f2\" (UID: \"404aadc7-59c7-4274-841e-38902a95c670\") " pod="openshift-multus/multus-additional-cni-plugins-t42f2"
Apr 16 17:40:04.494898 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.494393 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/9af59130-dcb7-4d75-a828-c42cc1333d3d-etc-sysctl-conf\") pod \"tuned-ndgvz\" (UID: \"9af59130-dcb7-4d75-a828-c42cc1333d3d\") " pod="openshift-cluster-node-tuning-operator/tuned-ndgvz"
Apr 16 17:40:04.494898 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.494417 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5ts9\" (UniqueName: \"kubernetes.io/projected/9af59130-dcb7-4d75-a828-c42cc1333d3d-kube-api-access-t5ts9\") pod \"tuned-ndgvz\" (UID: \"9af59130-dcb7-4d75-a828-c42cc1333d3d\") " pod="openshift-cluster-node-tuning-operator/tuned-ndgvz"
Apr 16 17:40:04.494898 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.494440 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d10122cd-f300-4191-95af-3535482c3187-host-run-netns\") pod \"multus-8nf86\" (UID: \"d10122cd-f300-4191-95af-3535482c3187\") " pod="openshift-multus/multus-8nf86"
Apr 16 17:40:04.494898 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.494463 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d10122cd-f300-4191-95af-3535482c3187-host-var-lib-kubelet\") pod \"multus-8nf86\" (UID: \"d10122cd-f300-4191-95af-3535482c3187\") " pod="openshift-multus/multus-8nf86"
Apr 16 17:40:04.494898 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.494490 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a3513544-fc2d-454a-86b1-8937a6fe9238-host-run-ovn-kubernetes\") pod \"ovnkube-node-8kd57\" (UID: \"a3513544-fc2d-454a-86b1-8937a6fe9238\") " pod="openshift-ovn-kubernetes/ovnkube-node-8kd57"
Apr 16 17:40:04.494898 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.494519 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9af59130-dcb7-4d75-a828-c42cc1333d3d-etc-kubernetes\") pod \"tuned-ndgvz\" (UID: \"9af59130-dcb7-4d75-a828-c42cc1333d3d\") " pod="openshift-cluster-node-tuning-operator/tuned-ndgvz"
Apr 16 17:40:04.494898 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.494546 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9af59130-dcb7-4d75-a828-c42cc1333d3d-host\") pod \"tuned-ndgvz\" (UID: \"9af59130-dcb7-4d75-a828-c42cc1333d3d\") " pod="openshift-cluster-node-tuning-operator/tuned-ndgvz"
Apr 16 17:40:04.494898 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.494570 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d10122cd-f300-4191-95af-3535482c3187-system-cni-dir\") pod \"multus-8nf86\" (UID: \"d10122cd-f300-4191-95af-3535482c3187\") " pod="openshift-multus/multus-8nf86"
Apr 16 17:40:04.495497 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.494594 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d10122cd-f300-4191-95af-3535482c3187-host-var-lib-cni-bin\") pod \"multus-8nf86\" (UID: \"d10122cd-f300-4191-95af-3535482c3187\") " pod="openshift-multus/multus-8nf86"
Apr 16 17:40:04.495497 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.494620 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d10122cd-f300-4191-95af-3535482c3187-hostroot\") pod \"multus-8nf86\" (UID: \"d10122cd-f300-4191-95af-3535482c3187\") " pod="openshift-multus/multus-8nf86"
Apr 16 17:40:04.495497 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.494643 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a3513544-fc2d-454a-86b1-8937a6fe9238-run-ovn\") pod \"ovnkube-node-8kd57\" (UID: \"a3513544-fc2d-454a-86b1-8937a6fe9238\") " pod="openshift-ovn-kubernetes/ovnkube-node-8kd57"
Apr 16 17:40:04.495497 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.494665 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a3513544-fc2d-454a-86b1-8937a6fe9238-ovn-node-metrics-cert\") pod \"ovnkube-node-8kd57\" (UID: \"a3513544-fc2d-454a-86b1-8937a6fe9238\") " pod="openshift-ovn-kubernetes/ovnkube-node-8kd57"
Apr 16 17:40:04.495497 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.494705 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/404aadc7-59c7-4274-841e-38902a95c670-cnibin\") pod \"multus-additional-cni-plugins-t42f2\" (UID: \"404aadc7-59c7-4274-841e-38902a95c670\") " pod="openshift-multus/multus-additional-cni-plugins-t42f2"
Apr 16 17:40:04.495497 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.494737 2576 reconciler_common.go:251]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d10122cd-f300-4191-95af-3535482c3187-multus-cni-dir\") pod \"multus-8nf86\" (UID: \"d10122cd-f300-4191-95af-3535482c3187\") " pod="openshift-multus/multus-8nf86" Apr 16 17:40:04.495497 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.494763 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a3513544-fc2d-454a-86b1-8937a6fe9238-host-kubelet\") pod \"ovnkube-node-8kd57\" (UID: \"a3513544-fc2d-454a-86b1-8937a6fe9238\") " pod="openshift-ovn-kubernetes/ovnkube-node-8kd57" Apr 16 17:40:04.495497 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.494787 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9af59130-dcb7-4d75-a828-c42cc1333d3d-sys\") pod \"tuned-ndgvz\" (UID: \"9af59130-dcb7-4d75-a828-c42cc1333d3d\") " pod="openshift-cluster-node-tuning-operator/tuned-ndgvz" Apr 16 17:40:04.495497 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.494810 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/b377c13c-6a96-47ac-be6e-6c19afe80cea-sys-fs\") pod \"aws-ebs-csi-driver-node-wckmp\" (UID: \"b377c13c-6a96-47ac-be6e-6c19afe80cea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wckmp" Apr 16 17:40:04.495497 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.494836 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlm6w\" (UniqueName: \"kubernetes.io/projected/98ed775b-36f2-475e-9c1b-e1e3a5261ed5-kube-api-access-mlm6w\") pod \"network-metrics-daemon-tw2xb\" (UID: \"98ed775b-36f2-475e-9c1b-e1e3a5261ed5\") " pod="openshift-multus/network-metrics-daemon-tw2xb" Apr 16 
17:40:04.495497 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.494861 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/9af59130-dcb7-4d75-a828-c42cc1333d3d-etc-tuned\") pod \"tuned-ndgvz\" (UID: \"9af59130-dcb7-4d75-a828-c42cc1333d3d\") " pod="openshift-cluster-node-tuning-operator/tuned-ndgvz" Apr 16 17:40:04.495497 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.494882 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvfqm\" (UniqueName: \"kubernetes.io/projected/77172d03-834d-4c9b-8b9c-2d2f57a663cd-kube-api-access-vvfqm\") pod \"node-ca-cbg2b\" (UID: \"77172d03-834d-4c9b-8b9c-2d2f57a663cd\") " pod="openshift-image-registry/node-ca-cbg2b" Apr 16 17:40:04.495497 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.494925 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/404aadc7-59c7-4274-841e-38902a95c670-system-cni-dir\") pod \"multus-additional-cni-plugins-t42f2\" (UID: \"404aadc7-59c7-4274-841e-38902a95c670\") " pod="openshift-multus/multus-additional-cni-plugins-t42f2" Apr 16 17:40:04.495497 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.494951 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a3513544-fc2d-454a-86b1-8937a6fe9238-host-slash\") pod \"ovnkube-node-8kd57\" (UID: \"a3513544-fc2d-454a-86b1-8937a6fe9238\") " pod="openshift-ovn-kubernetes/ovnkube-node-8kd57" Apr 16 17:40:04.495497 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.494975 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/404aadc7-59c7-4274-841e-38902a95c670-cni-sysctl-allowlist\") pod 
\"multus-additional-cni-plugins-t42f2\" (UID: \"404aadc7-59c7-4274-841e-38902a95c670\") " pod="openshift-multus/multus-additional-cni-plugins-t42f2" Apr 16 17:40:04.495497 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.495005 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d10122cd-f300-4191-95af-3535482c3187-host-var-lib-cni-multus\") pod \"multus-8nf86\" (UID: \"d10122cd-f300-4191-95af-3535482c3187\") " pod="openshift-multus/multus-8nf86" Apr 16 17:40:04.496152 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.495029 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a3513544-fc2d-454a-86b1-8937a6fe9238-systemd-units\") pod \"ovnkube-node-8kd57\" (UID: \"a3513544-fc2d-454a-86b1-8937a6fe9238\") " pod="openshift-ovn-kubernetes/ovnkube-node-8kd57" Apr 16 17:40:04.496152 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.495045 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a3513544-fc2d-454a-86b1-8937a6fe9238-ovnkube-config\") pod \"ovnkube-node-8kd57\" (UID: \"a3513544-fc2d-454a-86b1-8937a6fe9238\") " pod="openshift-ovn-kubernetes/ovnkube-node-8kd57" Apr 16 17:40:04.496152 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.495061 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a3513544-fc2d-454a-86b1-8937a6fe9238-ovnkube-script-lib\") pod \"ovnkube-node-8kd57\" (UID: \"a3513544-fc2d-454a-86b1-8937a6fe9238\") " pod="openshift-ovn-kubernetes/ovnkube-node-8kd57" Apr 16 17:40:04.496152 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.495091 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-rqtqk\" (UniqueName: \"kubernetes.io/projected/a3513544-fc2d-454a-86b1-8937a6fe9238-kube-api-access-rqtqk\") pod \"ovnkube-node-8kd57\" (UID: \"a3513544-fc2d-454a-86b1-8937a6fe9238\") " pod="openshift-ovn-kubernetes/ovnkube-node-8kd57" Apr 16 17:40:04.496152 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.495124 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/9af59130-dcb7-4d75-a828-c42cc1333d3d-etc-modprobe-d\") pod \"tuned-ndgvz\" (UID: \"9af59130-dcb7-4d75-a828-c42cc1333d3d\") " pod="openshift-cluster-node-tuning-operator/tuned-ndgvz" Apr 16 17:40:04.496152 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.495147 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/9af59130-dcb7-4d75-a828-c42cc1333d3d-etc-sysconfig\") pod \"tuned-ndgvz\" (UID: \"9af59130-dcb7-4d75-a828-c42cc1333d3d\") " pod="openshift-cluster-node-tuning-operator/tuned-ndgvz" Apr 16 17:40:04.496152 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.495173 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/b377c13c-6a96-47ac-be6e-6c19afe80cea-etc-selinux\") pod \"aws-ebs-csi-driver-node-wckmp\" (UID: \"b377c13c-6a96-47ac-be6e-6c19afe80cea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wckmp" Apr 16 17:40:04.496152 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.495195 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/404aadc7-59c7-4274-841e-38902a95c670-os-release\") pod \"multus-additional-cni-plugins-t42f2\" (UID: \"404aadc7-59c7-4274-841e-38902a95c670\") " pod="openshift-multus/multus-additional-cni-plugins-t42f2" Apr 
16 17:40:04.496152 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.495220 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/404aadc7-59c7-4274-841e-38902a95c670-cni-binary-copy\") pod \"multus-additional-cni-plugins-t42f2\" (UID: \"404aadc7-59c7-4274-841e-38902a95c670\") " pod="openshift-multus/multus-additional-cni-plugins-t42f2" Apr 16 17:40:04.525558 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.525523 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 17:35:03 +0000 UTC" deadline="2028-01-29 11:19:41.367244456 +0000 UTC" Apr 16 17:40:04.525558 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.525557 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15665h39m36.841690612s" Apr 16 17:40:04.583075 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.583044 2576 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 16 17:40:04.596485 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.596453 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/9af59130-dcb7-4d75-a828-c42cc1333d3d-etc-sysctl-conf\") pod \"tuned-ndgvz\" (UID: \"9af59130-dcb7-4d75-a828-c42cc1333d3d\") " pod="openshift-cluster-node-tuning-operator/tuned-ndgvz" Apr 16 17:40:04.596485 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.596485 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t5ts9\" (UniqueName: \"kubernetes.io/projected/9af59130-dcb7-4d75-a828-c42cc1333d3d-kube-api-access-t5ts9\") pod \"tuned-ndgvz\" (UID: \"9af59130-dcb7-4d75-a828-c42cc1333d3d\") " pod="openshift-cluster-node-tuning-operator/tuned-ndgvz" Apr 16 17:40:04.596714 ip-10-0-141-32 
kubenswrapper[2576]: I0416 17:40:04.596509 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d10122cd-f300-4191-95af-3535482c3187-host-run-netns\") pod \"multus-8nf86\" (UID: \"d10122cd-f300-4191-95af-3535482c3187\") " pod="openshift-multus/multus-8nf86" Apr 16 17:40:04.596714 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.596565 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d10122cd-f300-4191-95af-3535482c3187-host-run-netns\") pod \"multus-8nf86\" (UID: \"d10122cd-f300-4191-95af-3535482c3187\") " pod="openshift-multus/multus-8nf86" Apr 16 17:40:04.596714 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.596585 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d10122cd-f300-4191-95af-3535482c3187-host-var-lib-kubelet\") pod \"multus-8nf86\" (UID: \"d10122cd-f300-4191-95af-3535482c3187\") " pod="openshift-multus/multus-8nf86" Apr 16 17:40:04.596714 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.596624 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a3513544-fc2d-454a-86b1-8937a6fe9238-host-run-ovn-kubernetes\") pod \"ovnkube-node-8kd57\" (UID: \"a3513544-fc2d-454a-86b1-8937a6fe9238\") " pod="openshift-ovn-kubernetes/ovnkube-node-8kd57" Apr 16 17:40:04.596714 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.596655 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/eb3b87bf-55de-44cd-a182-ab40925c246f-agent-certs\") pod \"konnectivity-agent-dbzll\" (UID: \"eb3b87bf-55de-44cd-a182-ab40925c246f\") " pod="kube-system/konnectivity-agent-dbzll" Apr 16 17:40:04.596714 ip-10-0-141-32 
kubenswrapper[2576]: I0416 17:40:04.596683 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d10122cd-f300-4191-95af-3535482c3187-host-var-lib-kubelet\") pod \"multus-8nf86\" (UID: \"d10122cd-f300-4191-95af-3535482c3187\") " pod="openshift-multus/multus-8nf86" Apr 16 17:40:04.596714 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.596681 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/9af59130-dcb7-4d75-a828-c42cc1333d3d-etc-sysctl-conf\") pod \"tuned-ndgvz\" (UID: \"9af59130-dcb7-4d75-a828-c42cc1333d3d\") " pod="openshift-cluster-node-tuning-operator/tuned-ndgvz" Apr 16 17:40:04.597016 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.596718 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a3513544-fc2d-454a-86b1-8937a6fe9238-host-run-ovn-kubernetes\") pod \"ovnkube-node-8kd57\" (UID: \"a3513544-fc2d-454a-86b1-8937a6fe9238\") " pod="openshift-ovn-kubernetes/ovnkube-node-8kd57" Apr 16 17:40:04.597016 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.596745 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9af59130-dcb7-4d75-a828-c42cc1333d3d-etc-kubernetes\") pod \"tuned-ndgvz\" (UID: \"9af59130-dcb7-4d75-a828-c42cc1333d3d\") " pod="openshift-cluster-node-tuning-operator/tuned-ndgvz" Apr 16 17:40:04.597016 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.596770 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9af59130-dcb7-4d75-a828-c42cc1333d3d-host\") pod \"tuned-ndgvz\" (UID: \"9af59130-dcb7-4d75-a828-c42cc1333d3d\") " pod="openshift-cluster-node-tuning-operator/tuned-ndgvz" Apr 16 17:40:04.597016 ip-10-0-141-32 kubenswrapper[2576]: 
I0416 17:40:04.596785 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d10122cd-f300-4191-95af-3535482c3187-system-cni-dir\") pod \"multus-8nf86\" (UID: \"d10122cd-f300-4191-95af-3535482c3187\") " pod="openshift-multus/multus-8nf86" Apr 16 17:40:04.597016 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.596804 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d10122cd-f300-4191-95af-3535482c3187-host-var-lib-cni-bin\") pod \"multus-8nf86\" (UID: \"d10122cd-f300-4191-95af-3535482c3187\") " pod="openshift-multus/multus-8nf86" Apr 16 17:40:04.597016 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.596837 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d10122cd-f300-4191-95af-3535482c3187-hostroot\") pod \"multus-8nf86\" (UID: \"d10122cd-f300-4191-95af-3535482c3187\") " pod="openshift-multus/multus-8nf86" Apr 16 17:40:04.597016 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.596863 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a3513544-fc2d-454a-86b1-8937a6fe9238-run-ovn\") pod \"ovnkube-node-8kd57\" (UID: \"a3513544-fc2d-454a-86b1-8937a6fe9238\") " pod="openshift-ovn-kubernetes/ovnkube-node-8kd57" Apr 16 17:40:04.597016 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.596874 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9af59130-dcb7-4d75-a828-c42cc1333d3d-etc-kubernetes\") pod \"tuned-ndgvz\" (UID: \"9af59130-dcb7-4d75-a828-c42cc1333d3d\") " pod="openshift-cluster-node-tuning-operator/tuned-ndgvz" Apr 16 17:40:04.597016 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.596874 2576 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9af59130-dcb7-4d75-a828-c42cc1333d3d-host\") pod \"tuned-ndgvz\" (UID: \"9af59130-dcb7-4d75-a828-c42cc1333d3d\") " pod="openshift-cluster-node-tuning-operator/tuned-ndgvz" Apr 16 17:40:04.597016 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.596886 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a3513544-fc2d-454a-86b1-8937a6fe9238-ovn-node-metrics-cert\") pod \"ovnkube-node-8kd57\" (UID: \"a3513544-fc2d-454a-86b1-8937a6fe9238\") " pod="openshift-ovn-kubernetes/ovnkube-node-8kd57" Apr 16 17:40:04.597016 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.596943 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a3513544-fc2d-454a-86b1-8937a6fe9238-run-ovn\") pod \"ovnkube-node-8kd57\" (UID: \"a3513544-fc2d-454a-86b1-8937a6fe9238\") " pod="openshift-ovn-kubernetes/ovnkube-node-8kd57" Apr 16 17:40:04.597016 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.596955 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/404aadc7-59c7-4274-841e-38902a95c670-cnibin\") pod \"multus-additional-cni-plugins-t42f2\" (UID: \"404aadc7-59c7-4274-841e-38902a95c670\") " pod="openshift-multus/multus-additional-cni-plugins-t42f2" Apr 16 17:40:04.597016 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.596981 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d10122cd-f300-4191-95af-3535482c3187-hostroot\") pod \"multus-8nf86\" (UID: \"d10122cd-f300-4191-95af-3535482c3187\") " pod="openshift-multus/multus-8nf86" Apr 16 17:40:04.597016 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.596989 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d10122cd-f300-4191-95af-3535482c3187-multus-cni-dir\") pod \"multus-8nf86\" (UID: \"d10122cd-f300-4191-95af-3535482c3187\") " pod="openshift-multus/multus-8nf86" Apr 16 17:40:04.597016 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.597013 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a3513544-fc2d-454a-86b1-8937a6fe9238-host-kubelet\") pod \"ovnkube-node-8kd57\" (UID: \"a3513544-fc2d-454a-86b1-8937a6fe9238\") " pod="openshift-ovn-kubernetes/ovnkube-node-8kd57" Apr 16 17:40:04.597016 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.597016 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d10122cd-f300-4191-95af-3535482c3187-host-var-lib-cni-bin\") pod \"multus-8nf86\" (UID: \"d10122cd-f300-4191-95af-3535482c3187\") " pod="openshift-multus/multus-8nf86" Apr 16 17:40:04.597767 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.597044 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f72b7507-4253-4a04-ae02-afd105d65f75-host-slash\") pod \"iptables-alerter-28g5n\" (UID: \"f72b7507-4253-4a04-ae02-afd105d65f75\") " pod="openshift-network-operator/iptables-alerter-28g5n" Apr 16 17:40:04.597767 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.596887 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d10122cd-f300-4191-95af-3535482c3187-system-cni-dir\") pod \"multus-8nf86\" (UID: \"d10122cd-f300-4191-95af-3535482c3187\") " pod="openshift-multus/multus-8nf86" Apr 16 17:40:04.597767 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.597055 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/404aadc7-59c7-4274-841e-38902a95c670-cnibin\") pod \"multus-additional-cni-plugins-t42f2\" (UID: \"404aadc7-59c7-4274-841e-38902a95c670\") " pod="openshift-multus/multus-additional-cni-plugins-t42f2" Apr 16 17:40:04.597767 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.597069 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9af59130-dcb7-4d75-a828-c42cc1333d3d-sys\") pod \"tuned-ndgvz\" (UID: \"9af59130-dcb7-4d75-a828-c42cc1333d3d\") " pod="openshift-cluster-node-tuning-operator/tuned-ndgvz" Apr 16 17:40:04.597767 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.597105 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9af59130-dcb7-4d75-a828-c42cc1333d3d-sys\") pod \"tuned-ndgvz\" (UID: \"9af59130-dcb7-4d75-a828-c42cc1333d3d\") " pod="openshift-cluster-node-tuning-operator/tuned-ndgvz" Apr 16 17:40:04.597767 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.597124 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/b377c13c-6a96-47ac-be6e-6c19afe80cea-sys-fs\") pod \"aws-ebs-csi-driver-node-wckmp\" (UID: \"b377c13c-6a96-47ac-be6e-6c19afe80cea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wckmp" Apr 16 17:40:04.597767 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.597136 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a3513544-fc2d-454a-86b1-8937a6fe9238-host-kubelet\") pod \"ovnkube-node-8kd57\" (UID: \"a3513544-fc2d-454a-86b1-8937a6fe9238\") " pod="openshift-ovn-kubernetes/ovnkube-node-8kd57" Apr 16 17:40:04.597767 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.597158 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mlm6w\" (UniqueName: 
\"kubernetes.io/projected/98ed775b-36f2-475e-9c1b-e1e3a5261ed5-kube-api-access-mlm6w\") pod \"network-metrics-daemon-tw2xb\" (UID: \"98ed775b-36f2-475e-9c1b-e1e3a5261ed5\") " pod="openshift-multus/network-metrics-daemon-tw2xb" Apr 16 17:40:04.597767 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.597184 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/9af59130-dcb7-4d75-a828-c42cc1333d3d-etc-tuned\") pod \"tuned-ndgvz\" (UID: \"9af59130-dcb7-4d75-a828-c42cc1333d3d\") " pod="openshift-cluster-node-tuning-operator/tuned-ndgvz" Apr 16 17:40:04.597767 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.597184 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d10122cd-f300-4191-95af-3535482c3187-multus-cni-dir\") pod \"multus-8nf86\" (UID: \"d10122cd-f300-4191-95af-3535482c3187\") " pod="openshift-multus/multus-8nf86" Apr 16 17:40:04.597767 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.597185 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/b377c13c-6a96-47ac-be6e-6c19afe80cea-sys-fs\") pod \"aws-ebs-csi-driver-node-wckmp\" (UID: \"b377c13c-6a96-47ac-be6e-6c19afe80cea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wckmp" Apr 16 17:40:04.597767 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.597214 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vvfqm\" (UniqueName: \"kubernetes.io/projected/77172d03-834d-4c9b-8b9c-2d2f57a663cd-kube-api-access-vvfqm\") pod \"node-ca-cbg2b\" (UID: \"77172d03-834d-4c9b-8b9c-2d2f57a663cd\") " pod="openshift-image-registry/node-ca-cbg2b" Apr 16 17:40:04.597767 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.597304 2576 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 17:40:04.597767 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.597497 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/404aadc7-59c7-4274-841e-38902a95c670-system-cni-dir\") pod \"multus-additional-cni-plugins-t42f2\" (UID: \"404aadc7-59c7-4274-841e-38902a95c670\") " pod="openshift-multus/multus-additional-cni-plugins-t42f2" Apr 16 17:40:04.597767 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.597533 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a3513544-fc2d-454a-86b1-8937a6fe9238-host-slash\") pod \"ovnkube-node-8kd57\" (UID: \"a3513544-fc2d-454a-86b1-8937a6fe9238\") " pod="openshift-ovn-kubernetes/ovnkube-node-8kd57" Apr 16 17:40:04.597767 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.597565 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/404aadc7-59c7-4274-841e-38902a95c670-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-t42f2\" (UID: \"404aadc7-59c7-4274-841e-38902a95c670\") " pod="openshift-multus/multus-additional-cni-plugins-t42f2" Apr 16 17:40:04.597767 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.597590 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/404aadc7-59c7-4274-841e-38902a95c670-system-cni-dir\") pod \"multus-additional-cni-plugins-t42f2\" (UID: \"404aadc7-59c7-4274-841e-38902a95c670\") " pod="openshift-multus/multus-additional-cni-plugins-t42f2" Apr 16 17:40:04.597767 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.597594 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" 
(UniqueName: \"kubernetes.io/host-path/d10122cd-f300-4191-95af-3535482c3187-host-var-lib-cni-multus\") pod \"multus-8nf86\" (UID: \"d10122cd-f300-4191-95af-3535482c3187\") " pod="openshift-multus/multus-8nf86" Apr 16 17:40:04.598689 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.597631 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a3513544-fc2d-454a-86b1-8937a6fe9238-host-slash\") pod \"ovnkube-node-8kd57\" (UID: \"a3513544-fc2d-454a-86b1-8937a6fe9238\") " pod="openshift-ovn-kubernetes/ovnkube-node-8kd57" Apr 16 17:40:04.598689 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.597639 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a3513544-fc2d-454a-86b1-8937a6fe9238-systemd-units\") pod \"ovnkube-node-8kd57\" (UID: \"a3513544-fc2d-454a-86b1-8937a6fe9238\") " pod="openshift-ovn-kubernetes/ovnkube-node-8kd57" Apr 16 17:40:04.598689 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.597676 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d10122cd-f300-4191-95af-3535482c3187-host-var-lib-cni-multus\") pod \"multus-8nf86\" (UID: \"d10122cd-f300-4191-95af-3535482c3187\") " pod="openshift-multus/multus-8nf86" Apr 16 17:40:04.598689 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.597683 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a3513544-fc2d-454a-86b1-8937a6fe9238-ovnkube-config\") pod \"ovnkube-node-8kd57\" (UID: \"a3513544-fc2d-454a-86b1-8937a6fe9238\") " pod="openshift-ovn-kubernetes/ovnkube-node-8kd57" Apr 16 17:40:04.598689 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.597690 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/a3513544-fc2d-454a-86b1-8937a6fe9238-systemd-units\") pod \"ovnkube-node-8kd57\" (UID: \"a3513544-fc2d-454a-86b1-8937a6fe9238\") " pod="openshift-ovn-kubernetes/ovnkube-node-8kd57" Apr 16 17:40:04.598689 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.597709 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a3513544-fc2d-454a-86b1-8937a6fe9238-ovnkube-script-lib\") pod \"ovnkube-node-8kd57\" (UID: \"a3513544-fc2d-454a-86b1-8937a6fe9238\") " pod="openshift-ovn-kubernetes/ovnkube-node-8kd57" Apr 16 17:40:04.598689 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.597736 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rqtqk\" (UniqueName: \"kubernetes.io/projected/a3513544-fc2d-454a-86b1-8937a6fe9238-kube-api-access-rqtqk\") pod \"ovnkube-node-8kd57\" (UID: \"a3513544-fc2d-454a-86b1-8937a6fe9238\") " pod="openshift-ovn-kubernetes/ovnkube-node-8kd57" Apr 16 17:40:04.598689 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.597757 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/9af59130-dcb7-4d75-a828-c42cc1333d3d-etc-modprobe-d\") pod \"tuned-ndgvz\" (UID: \"9af59130-dcb7-4d75-a828-c42cc1333d3d\") " pod="openshift-cluster-node-tuning-operator/tuned-ndgvz" Apr 16 17:40:04.598689 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.597776 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/9af59130-dcb7-4d75-a828-c42cc1333d3d-etc-sysconfig\") pod \"tuned-ndgvz\" (UID: \"9af59130-dcb7-4d75-a828-c42cc1333d3d\") " pod="openshift-cluster-node-tuning-operator/tuned-ndgvz" Apr 16 17:40:04.598689 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.597796 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/b377c13c-6a96-47ac-be6e-6c19afe80cea-etc-selinux\") pod \"aws-ebs-csi-driver-node-wckmp\" (UID: \"b377c13c-6a96-47ac-be6e-6c19afe80cea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wckmp" Apr 16 17:40:04.598689 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.597816 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/404aadc7-59c7-4274-841e-38902a95c670-os-release\") pod \"multus-additional-cni-plugins-t42f2\" (UID: \"404aadc7-59c7-4274-841e-38902a95c670\") " pod="openshift-multus/multus-additional-cni-plugins-t42f2" Apr 16 17:40:04.598689 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.597836 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/404aadc7-59c7-4274-841e-38902a95c670-cni-binary-copy\") pod \"multus-additional-cni-plugins-t42f2\" (UID: \"404aadc7-59c7-4274-841e-38902a95c670\") " pod="openshift-multus/multus-additional-cni-plugins-t42f2" Apr 16 17:40:04.598689 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.597857 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d10122cd-f300-4191-95af-3535482c3187-cnibin\") pod \"multus-8nf86\" (UID: \"d10122cd-f300-4191-95af-3535482c3187\") " pod="openshift-multus/multus-8nf86" Apr 16 17:40:04.598689 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.597922 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d10122cd-f300-4191-95af-3535482c3187-cnibin\") pod \"multus-8nf86\" (UID: \"d10122cd-f300-4191-95af-3535482c3187\") " pod="openshift-multus/multus-8nf86" Apr 16 17:40:04.598689 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.597987 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"os-release\" (UniqueName: \"kubernetes.io/host-path/404aadc7-59c7-4274-841e-38902a95c670-os-release\") pod \"multus-additional-cni-plugins-t42f2\" (UID: \"404aadc7-59c7-4274-841e-38902a95c670\") " pod="openshift-multus/multus-additional-cni-plugins-t42f2" Apr 16 17:40:04.598689 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.598077 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/b377c13c-6a96-47ac-be6e-6c19afe80cea-etc-selinux\") pod \"aws-ebs-csi-driver-node-wckmp\" (UID: \"b377c13c-6a96-47ac-be6e-6c19afe80cea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wckmp" Apr 16 17:40:04.598689 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.598115 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/9af59130-dcb7-4d75-a828-c42cc1333d3d-etc-sysconfig\") pod \"tuned-ndgvz\" (UID: \"9af59130-dcb7-4d75-a828-c42cc1333d3d\") " pod="openshift-cluster-node-tuning-operator/tuned-ndgvz" Apr 16 17:40:04.599477 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.598142 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d10122cd-f300-4191-95af-3535482c3187-multus-conf-dir\") pod \"multus-8nf86\" (UID: \"d10122cd-f300-4191-95af-3535482c3187\") " pod="openshift-multus/multus-8nf86" Apr 16 17:40:04.599477 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.598170 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a3513544-fc2d-454a-86b1-8937a6fe9238-etc-openvswitch\") pod \"ovnkube-node-8kd57\" (UID: \"a3513544-fc2d-454a-86b1-8937a6fe9238\") " pod="openshift-ovn-kubernetes/ovnkube-node-8kd57" Apr 16 17:40:04.599477 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.598154 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/9af59130-dcb7-4d75-a828-c42cc1333d3d-etc-modprobe-d\") pod \"tuned-ndgvz\" (UID: \"9af59130-dcb7-4d75-a828-c42cc1333d3d\") " pod="openshift-cluster-node-tuning-operator/tuned-ndgvz" Apr 16 17:40:04.599477 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.598186 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/404aadc7-59c7-4274-841e-38902a95c670-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-t42f2\" (UID: \"404aadc7-59c7-4274-841e-38902a95c670\") " pod="openshift-multus/multus-additional-cni-plugins-t42f2" Apr 16 17:40:04.599477 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.598198 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9af59130-dcb7-4d75-a828-c42cc1333d3d-lib-modules\") pod \"tuned-ndgvz\" (UID: \"9af59130-dcb7-4d75-a828-c42cc1333d3d\") " pod="openshift-cluster-node-tuning-operator/tuned-ndgvz" Apr 16 17:40:04.599477 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.598224 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9af59130-dcb7-4d75-a828-c42cc1333d3d-var-lib-kubelet\") pod \"tuned-ndgvz\" (UID: \"9af59130-dcb7-4d75-a828-c42cc1333d3d\") " pod="openshift-cluster-node-tuning-operator/tuned-ndgvz" Apr 16 17:40:04.599477 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.598229 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a3513544-fc2d-454a-86b1-8937a6fe9238-ovnkube-config\") pod \"ovnkube-node-8kd57\" (UID: \"a3513544-fc2d-454a-86b1-8937a6fe9238\") " pod="openshift-ovn-kubernetes/ovnkube-node-8kd57" Apr 16 17:40:04.599477 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.598232 2576 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d10122cd-f300-4191-95af-3535482c3187-multus-conf-dir\") pod \"multus-8nf86\" (UID: \"d10122cd-f300-4191-95af-3535482c3187\") " pod="openshift-multus/multus-8nf86" Apr 16 17:40:04.599477 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.598249 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-47qfx\" (UniqueName: \"kubernetes.io/projected/d10122cd-f300-4191-95af-3535482c3187-kube-api-access-47qfx\") pod \"multus-8nf86\" (UID: \"d10122cd-f300-4191-95af-3535482c3187\") " pod="openshift-multus/multus-8nf86" Apr 16 17:40:04.599477 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.598262 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a3513544-fc2d-454a-86b1-8937a6fe9238-etc-openvswitch\") pod \"ovnkube-node-8kd57\" (UID: \"a3513544-fc2d-454a-86b1-8937a6fe9238\") " pod="openshift-ovn-kubernetes/ovnkube-node-8kd57" Apr 16 17:40:04.599477 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.598294 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9af59130-dcb7-4d75-a828-c42cc1333d3d-var-lib-kubelet\") pod \"tuned-ndgvz\" (UID: \"9af59130-dcb7-4d75-a828-c42cc1333d3d\") " pod="openshift-cluster-node-tuning-operator/tuned-ndgvz" Apr 16 17:40:04.599477 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.598306 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a3513544-fc2d-454a-86b1-8937a6fe9238-node-log\") pod \"ovnkube-node-8kd57\" (UID: \"a3513544-fc2d-454a-86b1-8937a6fe9238\") " pod="openshift-ovn-kubernetes/ovnkube-node-8kd57" Apr 16 17:40:04.599477 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.598325 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9af59130-dcb7-4d75-a828-c42cc1333d3d-lib-modules\") pod \"tuned-ndgvz\" (UID: \"9af59130-dcb7-4d75-a828-c42cc1333d3d\") " pod="openshift-cluster-node-tuning-operator/tuned-ndgvz" Apr 16 17:40:04.599477 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.598333 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/404aadc7-59c7-4274-841e-38902a95c670-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-t42f2\" (UID: \"404aadc7-59c7-4274-841e-38902a95c670\") " pod="openshift-multus/multus-additional-cni-plugins-t42f2" Apr 16 17:40:04.599477 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.598359 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d10122cd-f300-4191-95af-3535482c3187-cni-binary-copy\") pod \"multus-8nf86\" (UID: \"d10122cd-f300-4191-95af-3535482c3187\") " pod="openshift-multus/multus-8nf86" Apr 16 17:40:04.599477 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.598362 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/404aadc7-59c7-4274-841e-38902a95c670-cni-binary-copy\") pod \"multus-additional-cni-plugins-t42f2\" (UID: \"404aadc7-59c7-4274-841e-38902a95c670\") " pod="openshift-multus/multus-additional-cni-plugins-t42f2" Apr 16 17:40:04.599477 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.598376 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a3513544-fc2d-454a-86b1-8937a6fe9238-node-log\") pod \"ovnkube-node-8kd57\" (UID: \"a3513544-fc2d-454a-86b1-8937a6fe9238\") " pod="openshift-ovn-kubernetes/ovnkube-node-8kd57" Apr 16 17:40:04.600265 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.598382 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/d10122cd-f300-4191-95af-3535482c3187-multus-socket-dir-parent\") pod \"multus-8nf86\" (UID: \"d10122cd-f300-4191-95af-3535482c3187\") " pod="openshift-multus/multus-8nf86" Apr 16 17:40:04.600265 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.598406 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/d10122cd-f300-4191-95af-3535482c3187-host-run-k8s-cni-cncf-io\") pod \"multus-8nf86\" (UID: \"d10122cd-f300-4191-95af-3535482c3187\") " pod="openshift-multus/multus-8nf86" Apr 16 17:40:04.600265 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.598444 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a3513544-fc2d-454a-86b1-8937a6fe9238-var-lib-openvswitch\") pod \"ovnkube-node-8kd57\" (UID: \"a3513544-fc2d-454a-86b1-8937a6fe9238\") " pod="openshift-ovn-kubernetes/ovnkube-node-8kd57" Apr 16 17:40:04.600265 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.598449 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/d10122cd-f300-4191-95af-3535482c3187-multus-socket-dir-parent\") pod \"multus-8nf86\" (UID: \"d10122cd-f300-4191-95af-3535482c3187\") " pod="openshift-multus/multus-8nf86" Apr 16 17:40:04.600265 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.598481 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a3513544-fc2d-454a-86b1-8937a6fe9238-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8kd57\" (UID: \"a3513544-fc2d-454a-86b1-8937a6fe9238\") " pod="openshift-ovn-kubernetes/ovnkube-node-8kd57" Apr 16 17:40:04.600265 ip-10-0-141-32 
kubenswrapper[2576]: I0416 17:40:04.598502 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/77172d03-834d-4c9b-8b9c-2d2f57a663cd-host\") pod \"node-ca-cbg2b\" (UID: \"77172d03-834d-4c9b-8b9c-2d2f57a663cd\") " pod="openshift-image-registry/node-ca-cbg2b" Apr 16 17:40:04.600265 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.598514 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a3513544-fc2d-454a-86b1-8937a6fe9238-var-lib-openvswitch\") pod \"ovnkube-node-8kd57\" (UID: \"a3513544-fc2d-454a-86b1-8937a6fe9238\") " pod="openshift-ovn-kubernetes/ovnkube-node-8kd57" Apr 16 17:40:04.600265 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.598533 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b377c13c-6a96-47ac-be6e-6c19afe80cea-socket-dir\") pod \"aws-ebs-csi-driver-node-wckmp\" (UID: \"b377c13c-6a96-47ac-be6e-6c19afe80cea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wckmp" Apr 16 17:40:04.600265 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.598541 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a3513544-fc2d-454a-86b1-8937a6fe9238-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8kd57\" (UID: \"a3513544-fc2d-454a-86b1-8937a6fe9238\") " pod="openshift-ovn-kubernetes/ovnkube-node-8kd57" Apr 16 17:40:04.600265 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.598502 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/d10122cd-f300-4191-95af-3535482c3187-host-run-k8s-cni-cncf-io\") pod \"multus-8nf86\" (UID: \"d10122cd-f300-4191-95af-3535482c3187\") " pod="openshift-multus/multus-8nf86" 
Apr 16 17:40:04.600265 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.598556 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a3513544-fc2d-454a-86b1-8937a6fe9238-host-cni-bin\") pod \"ovnkube-node-8kd57\" (UID: \"a3513544-fc2d-454a-86b1-8937a6fe9238\") " pod="openshift-ovn-kubernetes/ovnkube-node-8kd57" Apr 16 17:40:04.600265 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.598580 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/77172d03-834d-4c9b-8b9c-2d2f57a663cd-host\") pod \"node-ca-cbg2b\" (UID: \"77172d03-834d-4c9b-8b9c-2d2f57a663cd\") " pod="openshift-image-registry/node-ca-cbg2b" Apr 16 17:40:04.600265 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.598596 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a3513544-fc2d-454a-86b1-8937a6fe9238-host-cni-bin\") pod \"ovnkube-node-8kd57\" (UID: \"a3513544-fc2d-454a-86b1-8937a6fe9238\") " pod="openshift-ovn-kubernetes/ovnkube-node-8kd57" Apr 16 17:40:04.600265 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.598597 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a3513544-fc2d-454a-86b1-8937a6fe9238-host-cni-netd\") pod \"ovnkube-node-8kd57\" (UID: \"a3513544-fc2d-454a-86b1-8937a6fe9238\") " pod="openshift-ovn-kubernetes/ovnkube-node-8kd57" Apr 16 17:40:04.600265 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.598627 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a3513544-fc2d-454a-86b1-8937a6fe9238-host-cni-netd\") pod \"ovnkube-node-8kd57\" (UID: \"a3513544-fc2d-454a-86b1-8937a6fe9238\") " pod="openshift-ovn-kubernetes/ovnkube-node-8kd57" Apr 16 17:40:04.600265 ip-10-0-141-32 
kubenswrapper[2576]: I0416 17:40:04.598626 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/98ed775b-36f2-475e-9c1b-e1e3a5261ed5-metrics-certs\") pod \"network-metrics-daemon-tw2xb\" (UID: \"98ed775b-36f2-475e-9c1b-e1e3a5261ed5\") " pod="openshift-multus/network-metrics-daemon-tw2xb" Apr 16 17:40:04.600265 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.598664 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/eb3b87bf-55de-44cd-a182-ab40925c246f-konnectivity-ca\") pod \"konnectivity-agent-dbzll\" (UID: \"eb3b87bf-55de-44cd-a182-ab40925c246f\") " pod="kube-system/konnectivity-agent-dbzll" Apr 16 17:40:04.600880 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.598688 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b377c13c-6a96-47ac-be6e-6c19afe80cea-socket-dir\") pod \"aws-ebs-csi-driver-node-wckmp\" (UID: \"b377c13c-6a96-47ac-be6e-6c19afe80cea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wckmp" Apr 16 17:40:04.600880 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.598698 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/f72b7507-4253-4a04-ae02-afd105d65f75-iptables-alerter-script\") pod \"iptables-alerter-28g5n\" (UID: \"f72b7507-4253-4a04-ae02-afd105d65f75\") " pod="openshift-network-operator/iptables-alerter-28g5n" Apr 16 17:40:04.600880 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:04.598712 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 17:40:04.600880 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.598726 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b377c13c-6a96-47ac-be6e-6c19afe80cea-kubelet-dir\") pod \"aws-ebs-csi-driver-node-wckmp\" (UID: \"b377c13c-6a96-47ac-be6e-6c19afe80cea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wckmp" Apr 16 17:40:04.600880 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.598751 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d10122cd-f300-4191-95af-3535482c3187-os-release\") pod \"multus-8nf86\" (UID: \"d10122cd-f300-4191-95af-3535482c3187\") " pod="openshift-multus/multus-8nf86" Apr 16 17:40:04.600880 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.598776 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d10122cd-f300-4191-95af-3535482c3187-multus-daemon-config\") pod \"multus-8nf86\" (UID: \"d10122cd-f300-4191-95af-3535482c3187\") " pod="openshift-multus/multus-8nf86" Apr 16 17:40:04.600880 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.598813 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b377c13c-6a96-47ac-be6e-6c19afe80cea-kubelet-dir\") pod \"aws-ebs-csi-driver-node-wckmp\" (UID: \"b377c13c-6a96-47ac-be6e-6c19afe80cea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wckmp" Apr 16 17:40:04.600880 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:04.598829 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98ed775b-36f2-475e-9c1b-e1e3a5261ed5-metrics-certs podName:98ed775b-36f2-475e-9c1b-e1e3a5261ed5 nodeName:}" failed. No retries permitted until 2026-04-16 17:40:05.098778489 +0000 UTC m=+3.115245989 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/98ed775b-36f2-475e-9c1b-e1e3a5261ed5-metrics-certs") pod "network-metrics-daemon-tw2xb" (UID: "98ed775b-36f2-475e-9c1b-e1e3a5261ed5") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 17:40:04.600880 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.598868 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d10122cd-f300-4191-95af-3535482c3187-os-release\") pod \"multus-8nf86\" (UID: \"d10122cd-f300-4191-95af-3535482c3187\") " pod="openshift-multus/multus-8nf86" Apr 16 17:40:04.600880 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.598860 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a3513544-fc2d-454a-86b1-8937a6fe9238-run-systemd\") pod \"ovnkube-node-8kd57\" (UID: \"a3513544-fc2d-454a-86b1-8937a6fe9238\") " pod="openshift-ovn-kubernetes/ovnkube-node-8kd57" Apr 16 17:40:04.600880 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.598884 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d10122cd-f300-4191-95af-3535482c3187-cni-binary-copy\") pod \"multus-8nf86\" (UID: \"d10122cd-f300-4191-95af-3535482c3187\") " pod="openshift-multus/multus-8nf86" Apr 16 17:40:04.600880 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.598932 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/9af59130-dcb7-4d75-a828-c42cc1333d3d-etc-sysctl-d\") pod \"tuned-ndgvz\" (UID: \"9af59130-dcb7-4d75-a828-c42cc1333d3d\") " pod="openshift-cluster-node-tuning-operator/tuned-ndgvz" Apr 16 17:40:04.600880 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.598931 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/404aadc7-59c7-4274-841e-38902a95c670-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-t42f2\" (UID: \"404aadc7-59c7-4274-841e-38902a95c670\") " pod="openshift-multus/multus-additional-cni-plugins-t42f2" Apr 16 17:40:04.600880 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.598973 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a3513544-fc2d-454a-86b1-8937a6fe9238-run-systemd\") pod \"ovnkube-node-8kd57\" (UID: \"a3513544-fc2d-454a-86b1-8937a6fe9238\") " pod="openshift-ovn-kubernetes/ovnkube-node-8kd57" Apr 16 17:40:04.600880 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.599019 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9af59130-dcb7-4d75-a828-c42cc1333d3d-run\") pod \"tuned-ndgvz\" (UID: \"9af59130-dcb7-4d75-a828-c42cc1333d3d\") " pod="openshift-cluster-node-tuning-operator/tuned-ndgvz" Apr 16 17:40:04.600880 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.599048 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9af59130-dcb7-4d75-a828-c42cc1333d3d-tmp\") pod \"tuned-ndgvz\" (UID: \"9af59130-dcb7-4d75-a828-c42cc1333d3d\") " pod="openshift-cluster-node-tuning-operator/tuned-ndgvz" Apr 16 17:40:04.600880 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.599102 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9af59130-dcb7-4d75-a828-c42cc1333d3d-run\") pod \"tuned-ndgvz\" (UID: \"9af59130-dcb7-4d75-a828-c42cc1333d3d\") " pod="openshift-cluster-node-tuning-operator/tuned-ndgvz" Apr 16 17:40:04.601593 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.599102 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" 
(UniqueName: \"kubernetes.io/host-path/b377c13c-6a96-47ac-be6e-6c19afe80cea-device-dir\") pod \"aws-ebs-csi-driver-node-wckmp\" (UID: \"b377c13c-6a96-47ac-be6e-6c19afe80cea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wckmp" Apr 16 17:40:04.601593 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.599126 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/9af59130-dcb7-4d75-a828-c42cc1333d3d-etc-sysctl-d\") pod \"tuned-ndgvz\" (UID: \"9af59130-dcb7-4d75-a828-c42cc1333d3d\") " pod="openshift-cluster-node-tuning-operator/tuned-ndgvz" Apr 16 17:40:04.601593 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.599137 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a3513544-fc2d-454a-86b1-8937a6fe9238-host-run-netns\") pod \"ovnkube-node-8kd57\" (UID: \"a3513544-fc2d-454a-86b1-8937a6fe9238\") " pod="openshift-ovn-kubernetes/ovnkube-node-8kd57" Apr 16 17:40:04.601593 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.599157 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/b377c13c-6a96-47ac-be6e-6c19afe80cea-device-dir\") pod \"aws-ebs-csi-driver-node-wckmp\" (UID: \"b377c13c-6a96-47ac-be6e-6c19afe80cea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wckmp" Apr 16 17:40:04.601593 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.599201 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a3513544-fc2d-454a-86b1-8937a6fe9238-run-openvswitch\") pod \"ovnkube-node-8kd57\" (UID: \"a3513544-fc2d-454a-86b1-8937a6fe9238\") " pod="openshift-ovn-kubernetes/ovnkube-node-8kd57" Apr 16 17:40:04.601593 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.599229 2576 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/9af59130-dcb7-4d75-a828-c42cc1333d3d-etc-systemd\") pod \"tuned-ndgvz\" (UID: \"9af59130-dcb7-4d75-a828-c42cc1333d3d\") " pod="openshift-cluster-node-tuning-operator/tuned-ndgvz" Apr 16 17:40:04.601593 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.599245 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d10122cd-f300-4191-95af-3535482c3187-multus-daemon-config\") pod \"multus-8nf86\" (UID: \"d10122cd-f300-4191-95af-3535482c3187\") " pod="openshift-multus/multus-8nf86" Apr 16 17:40:04.601593 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.599264 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a3513544-fc2d-454a-86b1-8937a6fe9238-run-openvswitch\") pod \"ovnkube-node-8kd57\" (UID: \"a3513544-fc2d-454a-86b1-8937a6fe9238\") " pod="openshift-ovn-kubernetes/ovnkube-node-8kd57" Apr 16 17:40:04.601593 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.599272 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b377c13c-6a96-47ac-be6e-6c19afe80cea-registration-dir\") pod \"aws-ebs-csi-driver-node-wckmp\" (UID: \"b377c13c-6a96-47ac-be6e-6c19afe80cea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wckmp" Apr 16 17:40:04.601593 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.599290 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a3513544-fc2d-454a-86b1-8937a6fe9238-host-run-netns\") pod \"ovnkube-node-8kd57\" (UID: \"a3513544-fc2d-454a-86b1-8937a6fe9238\") " pod="openshift-ovn-kubernetes/ovnkube-node-8kd57" Apr 16 17:40:04.601593 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.599308 2576 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/9af59130-dcb7-4d75-a828-c42cc1333d3d-etc-systemd\") pod \"tuned-ndgvz\" (UID: \"9af59130-dcb7-4d75-a828-c42cc1333d3d\") " pod="openshift-cluster-node-tuning-operator/tuned-ndgvz" Apr 16 17:40:04.601593 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.599332 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-frpkx\" (UniqueName: \"kubernetes.io/projected/b377c13c-6a96-47ac-be6e-6c19afe80cea-kube-api-access-frpkx\") pod \"aws-ebs-csi-driver-node-wckmp\" (UID: \"b377c13c-6a96-47ac-be6e-6c19afe80cea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wckmp" Apr 16 17:40:04.601593 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.599352 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b377c13c-6a96-47ac-be6e-6c19afe80cea-registration-dir\") pod \"aws-ebs-csi-driver-node-wckmp\" (UID: \"b377c13c-6a96-47ac-be6e-6c19afe80cea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wckmp" Apr 16 17:40:04.601593 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.599391 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/404aadc7-59c7-4274-841e-38902a95c670-tuning-conf-dir\") pod \"multus-additional-cni-plugins-t42f2\" (UID: \"404aadc7-59c7-4274-841e-38902a95c670\") " pod="openshift-multus/multus-additional-cni-plugins-t42f2" Apr 16 17:40:04.601593 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.599439 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d10122cd-f300-4191-95af-3535482c3187-host-run-multus-certs\") pod \"multus-8nf86\" (UID: \"d10122cd-f300-4191-95af-3535482c3187\") " pod="openshift-multus/multus-8nf86" Apr 16 17:40:04.601593 ip-10-0-141-32 
kubenswrapper[2576]: I0416 17:40:04.599450 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a3513544-fc2d-454a-86b1-8937a6fe9238-ovnkube-script-lib\") pod \"ovnkube-node-8kd57\" (UID: \"a3513544-fc2d-454a-86b1-8937a6fe9238\") " pod="openshift-ovn-kubernetes/ovnkube-node-8kd57"
Apr 16 17:40:04.601593 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.599584 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a3513544-fc2d-454a-86b1-8937a6fe9238-env-overrides\") pod \"ovnkube-node-8kd57\" (UID: \"a3513544-fc2d-454a-86b1-8937a6fe9238\") " pod="openshift-ovn-kubernetes/ovnkube-node-8kd57"
Apr 16 17:40:04.602211 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.599619 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c6rr5\" (UniqueName: \"kubernetes.io/projected/5a330027-b9ff-458e-aa78-e2eb5a0bda58-kube-api-access-c6rr5\") pod \"network-check-target-bs2mp\" (UID: \"5a330027-b9ff-458e-aa78-e2eb5a0bda58\") " pod="openshift-network-diagnostics/network-check-target-bs2mp"
Apr 16 17:40:04.602211 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.599666 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/77172d03-834d-4c9b-8b9c-2d2f57a663cd-serviceca\") pod \"node-ca-cbg2b\" (UID: \"77172d03-834d-4c9b-8b9c-2d2f57a663cd\") " pod="openshift-image-registry/node-ca-cbg2b"
Apr 16 17:40:04.602211 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.599689 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d10122cd-f300-4191-95af-3535482c3187-host-run-multus-certs\") pod \"multus-8nf86\" (UID: \"d10122cd-f300-4191-95af-3535482c3187\") " pod="openshift-multus/multus-8nf86"
Apr 16 17:40:04.602211 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.599704 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d10122cd-f300-4191-95af-3535482c3187-etc-kubernetes\") pod \"multus-8nf86\" (UID: \"d10122cd-f300-4191-95af-3535482c3187\") " pod="openshift-multus/multus-8nf86"
Apr 16 17:40:04.602211 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.599812 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a3513544-fc2d-454a-86b1-8937a6fe9238-log-socket\") pod \"ovnkube-node-8kd57\" (UID: \"a3513544-fc2d-454a-86b1-8937a6fe9238\") " pod="openshift-ovn-kubernetes/ovnkube-node-8kd57"
Apr 16 17:40:04.602211 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.599846 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzxq2\" (UniqueName: \"kubernetes.io/projected/f72b7507-4253-4a04-ae02-afd105d65f75-kube-api-access-pzxq2\") pod \"iptables-alerter-28g5n\" (UID: \"f72b7507-4253-4a04-ae02-afd105d65f75\") " pod="openshift-network-operator/iptables-alerter-28g5n"
Apr 16 17:40:04.602211 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.599921 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-22bp7\" (UniqueName: \"kubernetes.io/projected/404aadc7-59c7-4274-841e-38902a95c670-kube-api-access-22bp7\") pod \"multus-additional-cni-plugins-t42f2\" (UID: \"404aadc7-59c7-4274-841e-38902a95c670\") " pod="openshift-multus/multus-additional-cni-plugins-t42f2"
Apr 16 17:40:04.602211 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.599992 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/404aadc7-59c7-4274-841e-38902a95c670-tuning-conf-dir\") pod \"multus-additional-cni-plugins-t42f2\" (UID: \"404aadc7-59c7-4274-841e-38902a95c670\") " pod="openshift-multus/multus-additional-cni-plugins-t42f2"
Apr 16 17:40:04.602211 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.600065 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d10122cd-f300-4191-95af-3535482c3187-etc-kubernetes\") pod \"multus-8nf86\" (UID: \"d10122cd-f300-4191-95af-3535482c3187\") " pod="openshift-multus/multus-8nf86"
Apr 16 17:40:04.602211 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.600084 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a3513544-fc2d-454a-86b1-8937a6fe9238-env-overrides\") pod \"ovnkube-node-8kd57\" (UID: \"a3513544-fc2d-454a-86b1-8937a6fe9238\") " pod="openshift-ovn-kubernetes/ovnkube-node-8kd57"
Apr 16 17:40:04.602211 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.600113 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a3513544-fc2d-454a-86b1-8937a6fe9238-log-socket\") pod \"ovnkube-node-8kd57\" (UID: \"a3513544-fc2d-454a-86b1-8937a6fe9238\") " pod="openshift-ovn-kubernetes/ovnkube-node-8kd57"
Apr 16 17:40:04.602211 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.600478 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/77172d03-834d-4c9b-8b9c-2d2f57a663cd-serviceca\") pod \"node-ca-cbg2b\" (UID: \"77172d03-834d-4c9b-8b9c-2d2f57a663cd\") " pod="openshift-image-registry/node-ca-cbg2b"
Apr 16 17:40:04.602211 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.600996 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/9af59130-dcb7-4d75-a828-c42cc1333d3d-etc-tuned\") pod \"tuned-ndgvz\" (UID: \"9af59130-dcb7-4d75-a828-c42cc1333d3d\") " pod="openshift-cluster-node-tuning-operator/tuned-ndgvz"
Apr 16 17:40:04.602211 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.601124 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a3513544-fc2d-454a-86b1-8937a6fe9238-ovn-node-metrics-cert\") pod \"ovnkube-node-8kd57\" (UID: \"a3513544-fc2d-454a-86b1-8937a6fe9238\") " pod="openshift-ovn-kubernetes/ovnkube-node-8kd57"
Apr 16 17:40:04.602211 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.601959 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9af59130-dcb7-4d75-a828-c42cc1333d3d-tmp\") pod \"tuned-ndgvz\" (UID: \"9af59130-dcb7-4d75-a828-c42cc1333d3d\") " pod="openshift-cluster-node-tuning-operator/tuned-ndgvz"
Apr 16 17:40:04.610362 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:04.610333 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 17:40:04.610362 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:04.610363 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 17:40:04.610554 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:04.610377 2576 projected.go:194] Error preparing data for projected volume kube-api-access-c6rr5 for pod openshift-network-diagnostics/network-check-target-bs2mp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 17:40:04.610554 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:04.610489 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5a330027-b9ff-458e-aa78-e2eb5a0bda58-kube-api-access-c6rr5 podName:5a330027-b9ff-458e-aa78-e2eb5a0bda58 nodeName:}" failed. No retries permitted until 2026-04-16 17:40:05.110470535 +0000 UTC m=+3.126938058 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-c6rr5" (UniqueName: "kubernetes.io/projected/5a330027-b9ff-458e-aa78-e2eb5a0bda58-kube-api-access-c6rr5") pod "network-check-target-bs2mp" (UID: "5a330027-b9ff-458e-aa78-e2eb5a0bda58") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 17:40:04.612451 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.612426 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-frpkx\" (UniqueName: \"kubernetes.io/projected/b377c13c-6a96-47ac-be6e-6c19afe80cea-kube-api-access-frpkx\") pod \"aws-ebs-csi-driver-node-wckmp\" (UID: \"b377c13c-6a96-47ac-be6e-6c19afe80cea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wckmp"
Apr 16 17:40:04.612451 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.612426 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-47qfx\" (UniqueName: \"kubernetes.io/projected/d10122cd-f300-4191-95af-3535482c3187-kube-api-access-47qfx\") pod \"multus-8nf86\" (UID: \"d10122cd-f300-4191-95af-3535482c3187\") " pod="openshift-multus/multus-8nf86"
Apr 16 17:40:04.612926 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.612889 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqtqk\" (UniqueName: \"kubernetes.io/projected/a3513544-fc2d-454a-86b1-8937a6fe9238-kube-api-access-rqtqk\") pod \"ovnkube-node-8kd57\" (UID: \"a3513544-fc2d-454a-86b1-8937a6fe9238\") " pod="openshift-ovn-kubernetes/ovnkube-node-8kd57"
Apr 16 17:40:04.613020 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.612924 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlm6w\" (UniqueName: \"kubernetes.io/projected/98ed775b-36f2-475e-9c1b-e1e3a5261ed5-kube-api-access-mlm6w\") pod \"network-metrics-daemon-tw2xb\" (UID: \"98ed775b-36f2-475e-9c1b-e1e3a5261ed5\") " pod="openshift-multus/network-metrics-daemon-tw2xb"
Apr 16 17:40:04.613108 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.613082 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvfqm\" (UniqueName: \"kubernetes.io/projected/77172d03-834d-4c9b-8b9c-2d2f57a663cd-kube-api-access-vvfqm\") pod \"node-ca-cbg2b\" (UID: \"77172d03-834d-4c9b-8b9c-2d2f57a663cd\") " pod="openshift-image-registry/node-ca-cbg2b"
Apr 16 17:40:04.614000 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.613982 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5ts9\" (UniqueName: \"kubernetes.io/projected/9af59130-dcb7-4d75-a828-c42cc1333d3d-kube-api-access-t5ts9\") pod \"tuned-ndgvz\" (UID: \"9af59130-dcb7-4d75-a828-c42cc1333d3d\") " pod="openshift-cluster-node-tuning-operator/tuned-ndgvz"
Apr 16 17:40:04.614737 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.614719 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-22bp7\" (UniqueName: \"kubernetes.io/projected/404aadc7-59c7-4274-841e-38902a95c670-kube-api-access-22bp7\") pod \"multus-additional-cni-plugins-t42f2\" (UID: \"404aadc7-59c7-4274-841e-38902a95c670\") " pod="openshift-multus/multus-additional-cni-plugins-t42f2"
Apr 16 17:40:04.626457 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.626399 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-32.ec2.internal" event={"ID":"71ae8018aa9dd49bcac3f08161afc28d","Type":"ContainerStarted","Data":"8b578c3b83d34b5f4ebdbfd1d130b08608f7ebe4c7bc3f28eb0c86da563772e4"}
Apr 16 17:40:04.627353 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.627326 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-32.ec2.internal" event={"ID":"f34ad151e5f273367866fd58bbc327be","Type":"ContainerStarted","Data":"d307d20bc85e5caf4b2877905db8242e988e148905ef1a1209c8cb2688bc6d20"}
Apr 16 17:40:04.701022 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.700982 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/eb3b87bf-55de-44cd-a182-ab40925c246f-agent-certs\") pod \"konnectivity-agent-dbzll\" (UID: \"eb3b87bf-55de-44cd-a182-ab40925c246f\") " pod="kube-system/konnectivity-agent-dbzll"
Apr 16 17:40:04.701196 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.701062 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f72b7507-4253-4a04-ae02-afd105d65f75-host-slash\") pod \"iptables-alerter-28g5n\" (UID: \"f72b7507-4253-4a04-ae02-afd105d65f75\") " pod="openshift-network-operator/iptables-alerter-28g5n"
Apr 16 17:40:04.701196 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.701142 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/eb3b87bf-55de-44cd-a182-ab40925c246f-konnectivity-ca\") pod \"konnectivity-agent-dbzll\" (UID: \"eb3b87bf-55de-44cd-a182-ab40925c246f\") " pod="kube-system/konnectivity-agent-dbzll"
Apr 16 17:40:04.701196 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.701169 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/f72b7507-4253-4a04-ae02-afd105d65f75-iptables-alerter-script\") pod \"iptables-alerter-28g5n\" (UID: \"f72b7507-4253-4a04-ae02-afd105d65f75\") " pod="openshift-network-operator/iptables-alerter-28g5n"
Apr 16 17:40:04.701196 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.701187 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f72b7507-4253-4a04-ae02-afd105d65f75-host-slash\") pod \"iptables-alerter-28g5n\" (UID: \"f72b7507-4253-4a04-ae02-afd105d65f75\") " pod="openshift-network-operator/iptables-alerter-28g5n"
Apr 16 17:40:04.701437 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.701241 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pzxq2\" (UniqueName: \"kubernetes.io/projected/f72b7507-4253-4a04-ae02-afd105d65f75-kube-api-access-pzxq2\") pod \"iptables-alerter-28g5n\" (UID: \"f72b7507-4253-4a04-ae02-afd105d65f75\") " pod="openshift-network-operator/iptables-alerter-28g5n"
Apr 16 17:40:04.701762 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.701738 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/eb3b87bf-55de-44cd-a182-ab40925c246f-konnectivity-ca\") pod \"konnectivity-agent-dbzll\" (UID: \"eb3b87bf-55de-44cd-a182-ab40925c246f\") " pod="kube-system/konnectivity-agent-dbzll"
Apr 16 17:40:04.702263 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.702237 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/f72b7507-4253-4a04-ae02-afd105d65f75-iptables-alerter-script\") pod \"iptables-alerter-28g5n\" (UID: \"f72b7507-4253-4a04-ae02-afd105d65f75\") " pod="openshift-network-operator/iptables-alerter-28g5n"
Apr 16 17:40:04.704206 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.704183 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/eb3b87bf-55de-44cd-a182-ab40925c246f-agent-certs\") pod \"konnectivity-agent-dbzll\" (UID: \"eb3b87bf-55de-44cd-a182-ab40925c246f\") " pod="kube-system/konnectivity-agent-dbzll"
Apr 16 17:40:04.712640 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.712615 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzxq2\" (UniqueName: \"kubernetes.io/projected/f72b7507-4253-4a04-ae02-afd105d65f75-kube-api-access-pzxq2\") pod \"iptables-alerter-28g5n\" (UID: \"f72b7507-4253-4a04-ae02-afd105d65f75\") " pod="openshift-network-operator/iptables-alerter-28g5n"
Apr 16 17:40:04.782606 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.782512 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-ndgvz"
Apr 16 17:40:04.790474 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.790445 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-cbg2b"
Apr 16 17:40:04.805014 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.803307 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-8nf86"
Apr 16 17:40:04.809183 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.809160 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-8kd57"
Apr 16 17:40:04.814846 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.814818 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wckmp"
Apr 16 17:40:04.820564 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.820534 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-t42f2"
Apr 16 17:40:04.826226 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.826206 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-dbzll"
Apr 16 17:40:04.833283 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:04.833254 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-28g5n"
Apr 16 17:40:05.104649 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:05.104555 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/98ed775b-36f2-475e-9c1b-e1e3a5261ed5-metrics-certs\") pod \"network-metrics-daemon-tw2xb\" (UID: \"98ed775b-36f2-475e-9c1b-e1e3a5261ed5\") " pod="openshift-multus/network-metrics-daemon-tw2xb"
Apr 16 17:40:05.104805 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:05.104740 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 17:40:05.104853 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:05.104820 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98ed775b-36f2-475e-9c1b-e1e3a5261ed5-metrics-certs podName:98ed775b-36f2-475e-9c1b-e1e3a5261ed5 nodeName:}" failed. No retries permitted until 2026-04-16 17:40:06.104798488 +0000 UTC m=+4.121266006 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/98ed775b-36f2-475e-9c1b-e1e3a5261ed5-metrics-certs") pod "network-metrics-daemon-tw2xb" (UID: "98ed775b-36f2-475e-9c1b-e1e3a5261ed5") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 17:40:05.205230 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:05.205196 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c6rr5\" (UniqueName: \"kubernetes.io/projected/5a330027-b9ff-458e-aa78-e2eb5a0bda58-kube-api-access-c6rr5\") pod \"network-check-target-bs2mp\" (UID: \"5a330027-b9ff-458e-aa78-e2eb5a0bda58\") " pod="openshift-network-diagnostics/network-check-target-bs2mp"
Apr 16 17:40:05.205399 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:05.205373 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 17:40:05.205399 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:05.205394 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 17:40:05.205491 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:05.205404 2576 projected.go:194] Error preparing data for projected volume kube-api-access-c6rr5 for pod openshift-network-diagnostics/network-check-target-bs2mp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 17:40:05.205491 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:05.205461 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5a330027-b9ff-458e-aa78-e2eb5a0bda58-kube-api-access-c6rr5 podName:5a330027-b9ff-458e-aa78-e2eb5a0bda58 nodeName:}" failed. No retries permitted until 2026-04-16 17:40:06.205441703 +0000 UTC m=+4.221909214 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-c6rr5" (UniqueName: "kubernetes.io/projected/5a330027-b9ff-458e-aa78-e2eb5a0bda58-kube-api-access-c6rr5") pod "network-check-target-bs2mp" (UID: "5a330027-b9ff-458e-aa78-e2eb5a0bda58") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 17:40:05.371547 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:05.371512 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf72b7507_4253_4a04_ae02_afd105d65f75.slice/crio-c907bb02101cc942eb07e95c56da00d254546bc08334e0d6cb9f8e56f4d724ed WatchSource:0}: Error finding container c907bb02101cc942eb07e95c56da00d254546bc08334e0d6cb9f8e56f4d724ed: Status 404 returned error can't find the container with id c907bb02101cc942eb07e95c56da00d254546bc08334e0d6cb9f8e56f4d724ed
Apr 16 17:40:05.372964 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:05.372851 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3513544_fc2d_454a_86b1_8937a6fe9238.slice/crio-f9972401554adb6cf538e134b4e679ef11790c6646d24a2820df98174b47f6fb WatchSource:0}: Error finding container f9972401554adb6cf538e134b4e679ef11790c6646d24a2820df98174b47f6fb: Status 404 returned error can't find the container with id f9972401554adb6cf538e134b4e679ef11790c6646d24a2820df98174b47f6fb
Apr 16 17:40:05.373783 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:05.373760 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb377c13c_6a96_47ac_be6e_6c19afe80cea.slice/crio-14954f6fa6ce542c991b3dae803aa7772708944fd85c728804af139bca2c4e6c WatchSource:0}: Error finding container 14954f6fa6ce542c991b3dae803aa7772708944fd85c728804af139bca2c4e6c: Status 404 returned error can't find the container with id 14954f6fa6ce542c991b3dae803aa7772708944fd85c728804af139bca2c4e6c
Apr 16 17:40:05.374428 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:05.374406 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd10122cd_f300_4191_95af_3535482c3187.slice/crio-4f4c1edb5006b3c720c681555175444349029583b8bb2804c970d22c8e555c6d WatchSource:0}: Error finding container 4f4c1edb5006b3c720c681555175444349029583b8bb2804c970d22c8e555c6d: Status 404 returned error can't find the container with id 4f4c1edb5006b3c720c681555175444349029583b8bb2804c970d22c8e555c6d
Apr 16 17:40:05.378165 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:05.378141 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9af59130_dcb7_4d75_a828_c42cc1333d3d.slice/crio-b891879825554872eb620afc1ace182de879feb262fcb4f0ec5d25532a9835c5 WatchSource:0}: Error finding container b891879825554872eb620afc1ace182de879feb262fcb4f0ec5d25532a9835c5: Status 404 returned error can't find the container with id b891879825554872eb620afc1ace182de879feb262fcb4f0ec5d25532a9835c5
Apr 16 17:40:05.526242 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:05.526051 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 17:35:03 +0000 UTC" deadline="2028-01-16 04:02:47.688468394 +0000 UTC"
Apr 16 17:40:05.526242 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:05.526235 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15346h22m42.162236432s"
Apr 16 17:40:05.622894 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:05.622791 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bs2mp"
Apr 16 17:40:05.622894 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:05.622887 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bs2mp" podUID="5a330027-b9ff-458e-aa78-e2eb5a0bda58"
Apr 16 17:40:05.632247 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:05.632210 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-32.ec2.internal" event={"ID":"f34ad151e5f273367866fd58bbc327be","Type":"ContainerStarted","Data":"6da00c1e643684da9c0883ad3c4c0cbf765070441924104983fcfc47223d911c"}
Apr 16 17:40:05.633259 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:05.633235 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-dbzll" event={"ID":"eb3b87bf-55de-44cd-a182-ab40925c246f","Type":"ContainerStarted","Data":"4027ff8b9443c01768ac40fde29cd31e89b92a70c50b849cbee1fd04e6d1edb7"}
Apr 16 17:40:05.634284 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:05.634252 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-t42f2" event={"ID":"404aadc7-59c7-4274-841e-38902a95c670","Type":"ContainerStarted","Data":"fd98053d4706cd96782f85bb60f015d612980e2f7e78b285e7a68ff86ffd9886"}
Apr 16 17:40:05.636086 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:05.636067 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-ndgvz" event={"ID":"9af59130-dcb7-4d75-a828-c42cc1333d3d","Type":"ContainerStarted","Data":"b891879825554872eb620afc1ace182de879feb262fcb4f0ec5d25532a9835c5"}
Apr 16 17:40:05.637455 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:05.637403 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8nf86" event={"ID":"d10122cd-f300-4191-95af-3535482c3187","Type":"ContainerStarted","Data":"4f4c1edb5006b3c720c681555175444349029583b8bb2804c970d22c8e555c6d"}
Apr 16 17:40:05.638738 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:05.638718 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wckmp" event={"ID":"b377c13c-6a96-47ac-be6e-6c19afe80cea","Type":"ContainerStarted","Data":"14954f6fa6ce542c991b3dae803aa7772708944fd85c728804af139bca2c4e6c"}
Apr 16 17:40:05.640870 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:05.640852 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8kd57" event={"ID":"a3513544-fc2d-454a-86b1-8937a6fe9238","Type":"ContainerStarted","Data":"f9972401554adb6cf538e134b4e679ef11790c6646d24a2820df98174b47f6fb"}
Apr 16 17:40:05.641739 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:05.641722 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-cbg2b" event={"ID":"77172d03-834d-4c9b-8b9c-2d2f57a663cd","Type":"ContainerStarted","Data":"097d815c18e66cb0241da4b75a5c1495230bcc9e126490629bff35f1e5a77a72"}
Apr 16 17:40:05.642892 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:05.642871 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-28g5n" event={"ID":"f72b7507-4253-4a04-ae02-afd105d65f75","Type":"ContainerStarted","Data":"c907bb02101cc942eb07e95c56da00d254546bc08334e0d6cb9f8e56f4d724ed"}
Apr 16 17:40:05.646106 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:05.646071 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-32.ec2.internal" podStartSLOduration=2.646061664 podStartE2EDuration="2.646061664s" podCreationTimestamp="2026-04-16 17:40:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 17:40:05.645957045 +0000 UTC m=+3.662424567" watchObservedRunningTime="2026-04-16 17:40:05.646061664 +0000 UTC m=+3.662529196"
Apr 16 17:40:06.113380 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:06.113290 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/98ed775b-36f2-475e-9c1b-e1e3a5261ed5-metrics-certs\") pod \"network-metrics-daemon-tw2xb\" (UID: \"98ed775b-36f2-475e-9c1b-e1e3a5261ed5\") " pod="openshift-multus/network-metrics-daemon-tw2xb"
Apr 16 17:40:06.113554 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:06.113534 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 17:40:06.113626 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:06.113599 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98ed775b-36f2-475e-9c1b-e1e3a5261ed5-metrics-certs podName:98ed775b-36f2-475e-9c1b-e1e3a5261ed5 nodeName:}" failed. No retries permitted until 2026-04-16 17:40:08.113580186 +0000 UTC m=+6.130047687 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/98ed775b-36f2-475e-9c1b-e1e3a5261ed5-metrics-certs") pod "network-metrics-daemon-tw2xb" (UID: "98ed775b-36f2-475e-9c1b-e1e3a5261ed5") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 17:40:06.214518 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:06.214408 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c6rr5\" (UniqueName: \"kubernetes.io/projected/5a330027-b9ff-458e-aa78-e2eb5a0bda58-kube-api-access-c6rr5\") pod \"network-check-target-bs2mp\" (UID: \"5a330027-b9ff-458e-aa78-e2eb5a0bda58\") " pod="openshift-network-diagnostics/network-check-target-bs2mp"
Apr 16 17:40:06.214705 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:06.214564 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 17:40:06.214705 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:06.214586 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 17:40:06.214705 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:06.214598 2576 projected.go:194] Error preparing data for projected volume kube-api-access-c6rr5 for pod openshift-network-diagnostics/network-check-target-bs2mp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 17:40:06.214705 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:06.214654 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5a330027-b9ff-458e-aa78-e2eb5a0bda58-kube-api-access-c6rr5 podName:5a330027-b9ff-458e-aa78-e2eb5a0bda58 nodeName:}" failed. No retries permitted until 2026-04-16 17:40:08.214637134 +0000 UTC m=+6.231104656 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-c6rr5" (UniqueName: "kubernetes.io/projected/5a330027-b9ff-458e-aa78-e2eb5a0bda58-kube-api-access-c6rr5") pod "network-check-target-bs2mp" (UID: "5a330027-b9ff-458e-aa78-e2eb5a0bda58") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 17:40:06.625640 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:06.625604 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tw2xb"
Apr 16 17:40:06.626096 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:06.625748 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tw2xb" podUID="98ed775b-36f2-475e-9c1b-e1e3a5261ed5"
Apr 16 17:40:06.652872 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:06.651237 2576 generic.go:358] "Generic (PLEG): container finished" podID="71ae8018aa9dd49bcac3f08161afc28d" containerID="00672c29b9179c77092ecd5ffc1d16e85d8fa89f3a29590ca72d41e288f6755d" exitCode=0
Apr 16 17:40:06.652872 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:06.652605 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-32.ec2.internal" event={"ID":"71ae8018aa9dd49bcac3f08161afc28d","Type":"ContainerDied","Data":"00672c29b9179c77092ecd5ffc1d16e85d8fa89f3a29590ca72d41e288f6755d"}
Apr 16 17:40:07.622584 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:07.622550 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bs2mp"
Apr 16 17:40:07.622768 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:07.622693 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bs2mp" podUID="5a330027-b9ff-458e-aa78-e2eb5a0bda58"
Apr 16 17:40:07.657332 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:07.657289 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-32.ec2.internal" event={"ID":"71ae8018aa9dd49bcac3f08161afc28d","Type":"ContainerStarted","Data":"ccf4c28a5ee0730fe50c055ea45f0a4c40991e3ddf354abee4d0ab8c006f6d86"}
Apr 16 17:40:08.134614 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:08.134529 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/98ed775b-36f2-475e-9c1b-e1e3a5261ed5-metrics-certs\") pod \"network-metrics-daemon-tw2xb\" (UID: \"98ed775b-36f2-475e-9c1b-e1e3a5261ed5\") " pod="openshift-multus/network-metrics-daemon-tw2xb"
Apr 16 17:40:08.134811 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:08.134716 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 17:40:08.134811 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:08.134782 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98ed775b-36f2-475e-9c1b-e1e3a5261ed5-metrics-certs podName:98ed775b-36f2-475e-9c1b-e1e3a5261ed5 nodeName:}" failed. No retries permitted until 2026-04-16 17:40:12.134763287 +0000 UTC m=+10.151230803 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/98ed775b-36f2-475e-9c1b-e1e3a5261ed5-metrics-certs") pod "network-metrics-daemon-tw2xb" (UID: "98ed775b-36f2-475e-9c1b-e1e3a5261ed5") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 17:40:08.235533 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:08.235478 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c6rr5\" (UniqueName: \"kubernetes.io/projected/5a330027-b9ff-458e-aa78-e2eb5a0bda58-kube-api-access-c6rr5\") pod \"network-check-target-bs2mp\" (UID: \"5a330027-b9ff-458e-aa78-e2eb5a0bda58\") " pod="openshift-network-diagnostics/network-check-target-bs2mp"
Apr 16 17:40:08.235749 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:08.235640 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 17:40:08.235749 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:08.235658 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 17:40:08.235749 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:08.235669 2576 projected.go:194] Error preparing data for projected volume kube-api-access-c6rr5 for pod openshift-network-diagnostics/network-check-target-bs2mp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 17:40:08.235749 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:08.235727 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5a330027-b9ff-458e-aa78-e2eb5a0bda58-kube-api-access-c6rr5 podName:5a330027-b9ff-458e-aa78-e2eb5a0bda58 nodeName:}" failed. No retries permitted until 2026-04-16 17:40:12.235708718 +0000 UTC m=+10.252176217 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-c6rr5" (UniqueName: "kubernetes.io/projected/5a330027-b9ff-458e-aa78-e2eb5a0bda58-kube-api-access-c6rr5") pod "network-check-target-bs2mp" (UID: "5a330027-b9ff-458e-aa78-e2eb5a0bda58") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 17:40:08.622455 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:08.622419 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tw2xb"
Apr 16 17:40:08.622633 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:08.622573 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tw2xb" podUID="98ed775b-36f2-475e-9c1b-e1e3a5261ed5"
Apr 16 17:40:09.622685 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:09.622586 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bs2mp"
Apr 16 17:40:09.623135 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:09.622743 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bs2mp" podUID="5a330027-b9ff-458e-aa78-e2eb5a0bda58"
Apr 16 17:40:10.623330 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:10.623291 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tw2xb"
Apr 16 17:40:10.623773 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:10.623437 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tw2xb" podUID="98ed775b-36f2-475e-9c1b-e1e3a5261ed5"
Apr 16 17:40:11.623056 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:11.623019 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bs2mp"
Apr 16 17:40:11.623256 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:11.623162 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-target-bs2mp" podUID="5a330027-b9ff-458e-aa78-e2eb5a0bda58" Apr 16 17:40:12.171366 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:12.171324 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/98ed775b-36f2-475e-9c1b-e1e3a5261ed5-metrics-certs\") pod \"network-metrics-daemon-tw2xb\" (UID: \"98ed775b-36f2-475e-9c1b-e1e3a5261ed5\") " pod="openshift-multus/network-metrics-daemon-tw2xb" Apr 16 17:40:12.171827 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:12.171500 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 17:40:12.171827 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:12.171587 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98ed775b-36f2-475e-9c1b-e1e3a5261ed5-metrics-certs podName:98ed775b-36f2-475e-9c1b-e1e3a5261ed5 nodeName:}" failed. No retries permitted until 2026-04-16 17:40:20.171564413 +0000 UTC m=+18.188031914 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/98ed775b-36f2-475e-9c1b-e1e3a5261ed5-metrics-certs") pod "network-metrics-daemon-tw2xb" (UID: "98ed775b-36f2-475e-9c1b-e1e3a5261ed5") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 17:40:12.272153 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:12.272112 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c6rr5\" (UniqueName: \"kubernetes.io/projected/5a330027-b9ff-458e-aa78-e2eb5a0bda58-kube-api-access-c6rr5\") pod \"network-check-target-bs2mp\" (UID: \"5a330027-b9ff-458e-aa78-e2eb5a0bda58\") " pod="openshift-network-diagnostics/network-check-target-bs2mp" Apr 16 17:40:12.272340 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:12.272297 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 17:40:12.272340 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:12.272318 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 17:40:12.272340 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:12.272332 2576 projected.go:194] Error preparing data for projected volume kube-api-access-c6rr5 for pod openshift-network-diagnostics/network-check-target-bs2mp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 17:40:12.272487 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:12.272400 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5a330027-b9ff-458e-aa78-e2eb5a0bda58-kube-api-access-c6rr5 podName:5a330027-b9ff-458e-aa78-e2eb5a0bda58 nodeName:}" failed. 
No retries permitted until 2026-04-16 17:40:20.272380887 +0000 UTC m=+18.288848389 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-c6rr5" (UniqueName: "kubernetes.io/projected/5a330027-b9ff-458e-aa78-e2eb5a0bda58-kube-api-access-c6rr5") pod "network-check-target-bs2mp" (UID: "5a330027-b9ff-458e-aa78-e2eb5a0bda58") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 17:40:12.624194 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:12.624098 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tw2xb" Apr 16 17:40:12.624356 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:12.624226 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tw2xb" podUID="98ed775b-36f2-475e-9c1b-e1e3a5261ed5" Apr 16 17:40:13.622330 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:13.622300 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bs2mp" Apr 16 17:40:13.622735 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:13.622412 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-bs2mp" podUID="5a330027-b9ff-458e-aa78-e2eb5a0bda58" Apr 16 17:40:14.623246 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:14.623157 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tw2xb" Apr 16 17:40:14.623625 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:14.623313 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tw2xb" podUID="98ed775b-36f2-475e-9c1b-e1e3a5261ed5" Apr 16 17:40:15.622263 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:15.622217 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bs2mp" Apr 16 17:40:15.622481 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:15.622348 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bs2mp" podUID="5a330027-b9ff-458e-aa78-e2eb5a0bda58" Apr 16 17:40:16.623160 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:16.623123 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-tw2xb" Apr 16 17:40:16.623604 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:16.623249 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tw2xb" podUID="98ed775b-36f2-475e-9c1b-e1e3a5261ed5" Apr 16 17:40:17.622369 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:17.622335 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bs2mp" Apr 16 17:40:17.622528 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:17.622437 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bs2mp" podUID="5a330027-b9ff-458e-aa78-e2eb5a0bda58" Apr 16 17:40:18.622276 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:18.622240 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tw2xb" Apr 16 17:40:18.622786 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:18.622393 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-tw2xb" podUID="98ed775b-36f2-475e-9c1b-e1e3a5261ed5" Apr 16 17:40:19.622420 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:19.622389 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bs2mp" Apr 16 17:40:19.622850 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:19.622523 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bs2mp" podUID="5a330027-b9ff-458e-aa78-e2eb5a0bda58" Apr 16 17:40:20.230133 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:20.230096 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/98ed775b-36f2-475e-9c1b-e1e3a5261ed5-metrics-certs\") pod \"network-metrics-daemon-tw2xb\" (UID: \"98ed775b-36f2-475e-9c1b-e1e3a5261ed5\") " pod="openshift-multus/network-metrics-daemon-tw2xb" Apr 16 17:40:20.230285 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:20.230262 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 17:40:20.230344 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:20.230333 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98ed775b-36f2-475e-9c1b-e1e3a5261ed5-metrics-certs podName:98ed775b-36f2-475e-9c1b-e1e3a5261ed5 nodeName:}" failed. No retries permitted until 2026-04-16 17:40:36.230313911 +0000 UTC m=+34.246781413 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/98ed775b-36f2-475e-9c1b-e1e3a5261ed5-metrics-certs") pod "network-metrics-daemon-tw2xb" (UID: "98ed775b-36f2-475e-9c1b-e1e3a5261ed5") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 17:40:20.330935 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:20.330887 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c6rr5\" (UniqueName: \"kubernetes.io/projected/5a330027-b9ff-458e-aa78-e2eb5a0bda58-kube-api-access-c6rr5\") pod \"network-check-target-bs2mp\" (UID: \"5a330027-b9ff-458e-aa78-e2eb5a0bda58\") " pod="openshift-network-diagnostics/network-check-target-bs2mp" Apr 16 17:40:20.331112 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:20.331039 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 17:40:20.331112 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:20.331058 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 17:40:20.331112 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:20.331068 2576 projected.go:194] Error preparing data for projected volume kube-api-access-c6rr5 for pod openshift-network-diagnostics/network-check-target-bs2mp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 17:40:20.331229 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:20.331122 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5a330027-b9ff-458e-aa78-e2eb5a0bda58-kube-api-access-c6rr5 podName:5a330027-b9ff-458e-aa78-e2eb5a0bda58 nodeName:}" failed. 
No retries permitted until 2026-04-16 17:40:36.331103938 +0000 UTC m=+34.347571460 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-c6rr5" (UniqueName: "kubernetes.io/projected/5a330027-b9ff-458e-aa78-e2eb5a0bda58-kube-api-access-c6rr5") pod "network-check-target-bs2mp" (UID: "5a330027-b9ff-458e-aa78-e2eb5a0bda58") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 17:40:20.622903 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:20.622821 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tw2xb" Apr 16 17:40:20.623339 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:20.623012 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tw2xb" podUID="98ed775b-36f2-475e-9c1b-e1e3a5261ed5" Apr 16 17:40:21.623223 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:21.623178 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bs2mp" Apr 16 17:40:21.623660 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:21.623332 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-bs2mp" podUID="5a330027-b9ff-458e-aa78-e2eb5a0bda58" Apr 16 17:40:22.623046 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:22.622813 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tw2xb" Apr 16 17:40:22.623210 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:22.623187 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tw2xb" podUID="98ed775b-36f2-475e-9c1b-e1e3a5261ed5" Apr 16 17:40:22.691170 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:22.691137 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-dbzll" event={"ID":"eb3b87bf-55de-44cd-a182-ab40925c246f","Type":"ContainerStarted","Data":"bfc8ab2936b3b3f77a8af8f43b74431f81a23e726595cf45a6ea46911474047e"} Apr 16 17:40:22.692367 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:22.692344 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-t42f2" event={"ID":"404aadc7-59c7-4274-841e-38902a95c670","Type":"ContainerStarted","Data":"375ed7f84b75fc33dcb3b86f96c2e38c582a717a3b757563c66d223a4e128a14"} Apr 16 17:40:22.693467 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:22.693447 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-ndgvz" event={"ID":"9af59130-dcb7-4d75-a828-c42cc1333d3d","Type":"ContainerStarted","Data":"3b3297403c322c5935206b67305ab806053f090a86936ba44836cfda979b85a4"} Apr 16 17:40:22.694619 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:22.694601 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8nf86" 
event={"ID":"d10122cd-f300-4191-95af-3535482c3187","Type":"ContainerStarted","Data":"93f904702bf28cbe739316af4104532700deea6c98078499cf53d5e1b132dd06"} Apr 16 17:40:22.696418 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:22.696400 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wckmp" event={"ID":"b377c13c-6a96-47ac-be6e-6c19afe80cea","Type":"ContainerStarted","Data":"1c0367cd3db13b7529969659b63ed3223c0ee1715fbadf6fc91e76d97026eede"} Apr 16 17:40:22.697695 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:22.697676 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-cbg2b" event={"ID":"77172d03-834d-4c9b-8b9c-2d2f57a663cd","Type":"ContainerStarted","Data":"de9740c9c88f0397084c4a44dfc9170ea52a81189fac85f7e7036764a6a09dc5"} Apr 16 17:40:22.712417 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:22.712376 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-32.ec2.internal" podStartSLOduration=19.712366793 podStartE2EDuration="19.712366793s" podCreationTimestamp="2026-04-16 17:40:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 17:40:07.674920707 +0000 UTC m=+5.691388223" watchObservedRunningTime="2026-04-16 17:40:22.712366793 +0000 UTC m=+20.728834311" Apr 16 17:40:22.712614 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:22.712593 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-dbzll" podStartSLOduration=3.774401082 podStartE2EDuration="20.712587528s" podCreationTimestamp="2026-04-16 17:40:02 +0000 UTC" firstStartedPulling="2026-04-16 17:40:05.382968066 +0000 UTC m=+3.399435566" lastFinishedPulling="2026-04-16 17:40:22.321154496 +0000 UTC m=+20.337622012" observedRunningTime="2026-04-16 17:40:22.712113352 +0000 
UTC m=+20.728580874" watchObservedRunningTime="2026-04-16 17:40:22.712587528 +0000 UTC m=+20.729055060" Apr 16 17:40:22.735132 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:22.732971 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-ndgvz" podStartSLOduration=3.819315624 podStartE2EDuration="20.732954818s" podCreationTimestamp="2026-04-16 17:40:02 +0000 UTC" firstStartedPulling="2026-04-16 17:40:05.380438779 +0000 UTC m=+3.396906281" lastFinishedPulling="2026-04-16 17:40:22.294077966 +0000 UTC m=+20.310545475" observedRunningTime="2026-04-16 17:40:22.731848631 +0000 UTC m=+20.748316151" watchObservedRunningTime="2026-04-16 17:40:22.732954818 +0000 UTC m=+20.749422341" Apr 16 17:40:22.747761 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:22.747712 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-cbg2b" podStartSLOduration=3.809867702 podStartE2EDuration="20.747697699s" podCreationTimestamp="2026-04-16 17:40:02 +0000 UTC" firstStartedPulling="2026-04-16 17:40:05.383323406 +0000 UTC m=+3.399790905" lastFinishedPulling="2026-04-16 17:40:22.321153398 +0000 UTC m=+20.337620902" observedRunningTime="2026-04-16 17:40:22.747297683 +0000 UTC m=+20.763765205" watchObservedRunningTime="2026-04-16 17:40:22.747697699 +0000 UTC m=+20.764165231" Apr 16 17:40:22.767125 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:22.767082 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-8nf86" podStartSLOduration=3.7919418289999998 podStartE2EDuration="20.767067258s" podCreationTimestamp="2026-04-16 17:40:02 +0000 UTC" firstStartedPulling="2026-04-16 17:40:05.377591154 +0000 UTC m=+3.394058667" lastFinishedPulling="2026-04-16 17:40:22.352716583 +0000 UTC m=+20.369184096" observedRunningTime="2026-04-16 17:40:22.767003142 +0000 UTC m=+20.783470663" watchObservedRunningTime="2026-04-16 17:40:22.767067258 
+0000 UTC m=+20.783534816" Apr 16 17:40:23.443826 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:23.443652 2576 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 17:40:23.563764 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:23.563580 2576 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T17:40:23.443807453Z","UUID":"39db66c1-4b82-40cf-8af6-3e01933e4d20","Handler":null,"Name":"","Endpoint":""} Apr 16 17:40:23.565125 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:23.565105 2576 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 16 17:40:23.565230 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:23.565131 2576 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 17:40:23.623057 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:23.623024 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bs2mp" Apr 16 17:40:23.623195 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:23.623128 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-bs2mp" podUID="5a330027-b9ff-458e-aa78-e2eb5a0bda58" Apr 16 17:40:23.700413 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:23.700378 2576 generic.go:358] "Generic (PLEG): container finished" podID="404aadc7-59c7-4274-841e-38902a95c670" containerID="375ed7f84b75fc33dcb3b86f96c2e38c582a717a3b757563c66d223a4e128a14" exitCode=0 Apr 16 17:40:23.701205 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:23.700465 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-t42f2" event={"ID":"404aadc7-59c7-4274-841e-38902a95c670","Type":"ContainerDied","Data":"375ed7f84b75fc33dcb3b86f96c2e38c582a717a3b757563c66d223a4e128a14"} Apr 16 17:40:23.702106 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:23.702084 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wckmp" event={"ID":"b377c13c-6a96-47ac-be6e-6c19afe80cea","Type":"ContainerStarted","Data":"43bfcb2549979ef141a998e7f3e330df8f4e27ff60fe94092c64199967701a22"} Apr 16 17:40:23.704699 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:23.704668 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8kd57" event={"ID":"a3513544-fc2d-454a-86b1-8937a6fe9238","Type":"ContainerStarted","Data":"257550747a784674b35efac7ba707f9540bdb891935318a8aec423d1e3fb798e"} Apr 16 17:40:23.704791 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:23.704729 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8kd57" event={"ID":"a3513544-fc2d-454a-86b1-8937a6fe9238","Type":"ContainerStarted","Data":"2a6e5ffd7937205fcab1f65ae3f56f447418de09e71924cc59993fe5e0fb335a"} Apr 16 17:40:23.704791 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:23.704775 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8kd57" 
event={"ID":"a3513544-fc2d-454a-86b1-8937a6fe9238","Type":"ContainerStarted","Data":"7c6731f28c6db7beb9b93eea78dac29a1f02cb0e63eb5de9ed566e03ba1f625d"} Apr 16 17:40:23.704877 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:23.704790 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8kd57" event={"ID":"a3513544-fc2d-454a-86b1-8937a6fe9238","Type":"ContainerStarted","Data":"f0be1cf64ccb6aeab983dd39d13f4e2a14d44caebac78a8b0b04e567c8978f3e"} Apr 16 17:40:23.704877 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:23.704804 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8kd57" event={"ID":"a3513544-fc2d-454a-86b1-8937a6fe9238","Type":"ContainerStarted","Data":"5c4b3174399e49afd8741b1473fbd4325b16a0ed56206a66773d5ae3b4d2ea5b"} Apr 16 17:40:23.704877 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:23.704830 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8kd57" event={"ID":"a3513544-fc2d-454a-86b1-8937a6fe9238","Type":"ContainerStarted","Data":"016619aae5c1eeedc47c5434127f7e900493844cb62a88b09807d0717d0aa1fe"} Apr 16 17:40:23.706983 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:23.706951 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-28g5n" event={"ID":"f72b7507-4253-4a04-ae02-afd105d65f75","Type":"ContainerStarted","Data":"1589543d676408a68b2e3f98b8a4f5de03cb3d51eeac921aeb5644b91725f4ca"} Apr 16 17:40:23.751663 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:23.751617 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-28g5n" podStartSLOduration=4.8307373810000005 podStartE2EDuration="21.751601689s" podCreationTimestamp="2026-04-16 17:40:02 +0000 UTC" firstStartedPulling="2026-04-16 17:40:05.373214488 +0000 UTC m=+3.389681987" lastFinishedPulling="2026-04-16 17:40:22.29407879 +0000 UTC 
m=+20.310546295" observedRunningTime="2026-04-16 17:40:23.751570446 +0000 UTC m=+21.768037963" watchObservedRunningTime="2026-04-16 17:40:23.751601689 +0000 UTC m=+21.768069204" Apr 16 17:40:24.135355 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:24.135321 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-p4dbv"] Apr 16 17:40:24.140364 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:24.140334 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-p4dbv" Apr 16 17:40:24.142856 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:24.142832 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-jcfh8\"" Apr 16 17:40:24.143214 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:24.143190 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 16 17:40:24.143420 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:24.143403 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 16 17:40:24.257236 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:24.257194 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/559281b5-6292-4642-95ef-022daeacb46b-tmp-dir\") pod \"node-resolver-p4dbv\" (UID: \"559281b5-6292-4642-95ef-022daeacb46b\") " pod="openshift-dns/node-resolver-p4dbv" Apr 16 17:40:24.257414 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:24.257323 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/559281b5-6292-4642-95ef-022daeacb46b-hosts-file\") pod \"node-resolver-p4dbv\" (UID: \"559281b5-6292-4642-95ef-022daeacb46b\") " pod="openshift-dns/node-resolver-p4dbv" Apr 16 17:40:24.257414 
ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:24.257365 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drn92\" (UniqueName: \"kubernetes.io/projected/559281b5-6292-4642-95ef-022daeacb46b-kube-api-access-drn92\") pod \"node-resolver-p4dbv\" (UID: \"559281b5-6292-4642-95ef-022daeacb46b\") " pod="openshift-dns/node-resolver-p4dbv" Apr 16 17:40:24.357988 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:24.357954 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/559281b5-6292-4642-95ef-022daeacb46b-tmp-dir\") pod \"node-resolver-p4dbv\" (UID: \"559281b5-6292-4642-95ef-022daeacb46b\") " pod="openshift-dns/node-resolver-p4dbv" Apr 16 17:40:24.358146 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:24.358028 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/559281b5-6292-4642-95ef-022daeacb46b-hosts-file\") pod \"node-resolver-p4dbv\" (UID: \"559281b5-6292-4642-95ef-022daeacb46b\") " pod="openshift-dns/node-resolver-p4dbv" Apr 16 17:40:24.358146 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:24.358050 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-drn92\" (UniqueName: \"kubernetes.io/projected/559281b5-6292-4642-95ef-022daeacb46b-kube-api-access-drn92\") pod \"node-resolver-p4dbv\" (UID: \"559281b5-6292-4642-95ef-022daeacb46b\") " pod="openshift-dns/node-resolver-p4dbv" Apr 16 17:40:24.358229 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:24.358141 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/559281b5-6292-4642-95ef-022daeacb46b-hosts-file\") pod \"node-resolver-p4dbv\" (UID: \"559281b5-6292-4642-95ef-022daeacb46b\") " pod="openshift-dns/node-resolver-p4dbv" Apr 16 17:40:24.358361 ip-10-0-141-32 
kubenswrapper[2576]: I0416 17:40:24.358339 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/559281b5-6292-4642-95ef-022daeacb46b-tmp-dir\") pod \"node-resolver-p4dbv\" (UID: \"559281b5-6292-4642-95ef-022daeacb46b\") " pod="openshift-dns/node-resolver-p4dbv" Apr 16 17:40:24.370574 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:24.370543 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-drn92\" (UniqueName: \"kubernetes.io/projected/559281b5-6292-4642-95ef-022daeacb46b-kube-api-access-drn92\") pod \"node-resolver-p4dbv\" (UID: \"559281b5-6292-4642-95ef-022daeacb46b\") " pod="openshift-dns/node-resolver-p4dbv" Apr 16 17:40:24.451732 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:24.451698 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-p4dbv" Apr 16 17:40:24.460158 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:24.460123 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod559281b5_6292_4642_95ef_022daeacb46b.slice/crio-22e228f264c973ef24f4ca7f015dc889843a5125c817e4d2b36fa7f3d6132b66 WatchSource:0}: Error finding container 22e228f264c973ef24f4ca7f015dc889843a5125c817e4d2b36fa7f3d6132b66: Status 404 returned error can't find the container with id 22e228f264c973ef24f4ca7f015dc889843a5125c817e4d2b36fa7f3d6132b66 Apr 16 17:40:24.623329 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:24.623141 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-tw2xb" Apr 16 17:40:24.623500 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:24.623437 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tw2xb" podUID="98ed775b-36f2-475e-9c1b-e1e3a5261ed5" Apr 16 17:40:24.711136 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:24.711055 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wckmp" event={"ID":"b377c13c-6a96-47ac-be6e-6c19afe80cea","Type":"ContainerStarted","Data":"2da3ef9aab6e78865209ea697224c16a52011644bff63d65e56884e146ba2359"} Apr 16 17:40:24.712465 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:24.712414 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-p4dbv" event={"ID":"559281b5-6292-4642-95ef-022daeacb46b","Type":"ContainerStarted","Data":"f6fc2097c6cb67c48b8f1c712bf1b105bb62396d6307e1361e64bdd2fb150a8a"} Apr 16 17:40:24.712657 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:24.712481 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-p4dbv" event={"ID":"559281b5-6292-4642-95ef-022daeacb46b","Type":"ContainerStarted","Data":"22e228f264c973ef24f4ca7f015dc889843a5125c817e4d2b36fa7f3d6132b66"} Apr 16 17:40:24.756689 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:24.756630 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-p4dbv" podStartSLOduration=0.75660995 podStartE2EDuration="756.60995ms" podCreationTimestamp="2026-04-16 17:40:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-04-16 17:40:24.756575612 +0000 UTC m=+22.773043144" watchObservedRunningTime="2026-04-16 17:40:24.75660995 +0000 UTC m=+22.773077466" Apr 16 17:40:24.757301 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:24.757260 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wckmp" podStartSLOduration=3.92011992 podStartE2EDuration="22.757246691s" podCreationTimestamp="2026-04-16 17:40:02 +0000 UTC" firstStartedPulling="2026-04-16 17:40:05.37625496 +0000 UTC m=+3.392722461" lastFinishedPulling="2026-04-16 17:40:24.21338172 +0000 UTC m=+22.229849232" observedRunningTime="2026-04-16 17:40:24.735318394 +0000 UTC m=+22.751785916" watchObservedRunningTime="2026-04-16 17:40:24.757246691 +0000 UTC m=+22.773714212" Apr 16 17:40:25.622787 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:25.622741 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bs2mp" Apr 16 17:40:25.622972 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:25.622882 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bs2mp" podUID="5a330027-b9ff-458e-aa78-e2eb5a0bda58" Apr 16 17:40:25.718638 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:25.718537 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8kd57" event={"ID":"a3513544-fc2d-454a-86b1-8937a6fe9238","Type":"ContainerStarted","Data":"34b17810793e1de6275fffef2948cb1086b2fc3471272fc8d4ee83770864b431"} Apr 16 17:40:26.622838 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:26.622796 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-tw2xb" Apr 16 17:40:26.623039 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:26.622949 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tw2xb" podUID="98ed775b-36f2-475e-9c1b-e1e3a5261ed5" Apr 16 17:40:26.993191 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:26.993150 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-dbzll" Apr 16 17:40:26.993991 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:26.993972 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-dbzll" Apr 16 17:40:27.622979 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:27.622756 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bs2mp" Apr 16 17:40:27.623136 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:27.623094 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bs2mp" podUID="5a330027-b9ff-458e-aa78-e2eb5a0bda58" Apr 16 17:40:27.640513 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:27.640486 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-hxk67"] Apr 16 17:40:27.660510 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:27.660425 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-hxk67" Apr 16 17:40:27.660699 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:27.660526 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hxk67" podUID="908ddae4-92b4-4772-acb3-f93ed0f019ea" Apr 16 17:40:27.724551 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:27.724495 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-t42f2" event={"ID":"404aadc7-59c7-4274-841e-38902a95c670","Type":"ContainerStarted","Data":"3d3a55b59c6fa707c24926460a4e6d94f1416f2a35e4254af30988181d686cec"} Apr 16 17:40:27.728070 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:27.728032 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8kd57" event={"ID":"a3513544-fc2d-454a-86b1-8937a6fe9238","Type":"ContainerStarted","Data":"33fb71b6e27807609c0e95c7965687e07597afa63532d7a987255964762a1a8e"} Apr 16 17:40:27.728328 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:27.728308 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-dbzll" Apr 16 17:40:27.729010 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:27.728989 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-dbzll" Apr 16 17:40:27.786459 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:27.786431 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/908ddae4-92b4-4772-acb3-f93ed0f019ea-kubelet-config\") pod \"global-pull-secret-syncer-hxk67\" (UID: 
\"908ddae4-92b4-4772-acb3-f93ed0f019ea\") " pod="kube-system/global-pull-secret-syncer-hxk67" Apr 16 17:40:27.786590 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:27.786463 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/908ddae4-92b4-4772-acb3-f93ed0f019ea-dbus\") pod \"global-pull-secret-syncer-hxk67\" (UID: \"908ddae4-92b4-4772-acb3-f93ed0f019ea\") " pod="kube-system/global-pull-secret-syncer-hxk67" Apr 16 17:40:27.786590 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:27.786484 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/908ddae4-92b4-4772-acb3-f93ed0f019ea-original-pull-secret\") pod \"global-pull-secret-syncer-hxk67\" (UID: \"908ddae4-92b4-4772-acb3-f93ed0f019ea\") " pod="kube-system/global-pull-secret-syncer-hxk67" Apr 16 17:40:27.796647 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:27.796584 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-8kd57" podStartSLOduration=8.36721521 podStartE2EDuration="25.796566042s" podCreationTimestamp="2026-04-16 17:40:02 +0000 UTC" firstStartedPulling="2026-04-16 17:40:05.376316444 +0000 UTC m=+3.392783947" lastFinishedPulling="2026-04-16 17:40:22.805667264 +0000 UTC m=+20.822134779" observedRunningTime="2026-04-16 17:40:27.795448773 +0000 UTC m=+25.811916297" watchObservedRunningTime="2026-04-16 17:40:27.796566042 +0000 UTC m=+25.813033563" Apr 16 17:40:27.888226 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:27.887894 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/908ddae4-92b4-4772-acb3-f93ed0f019ea-kubelet-config\") pod \"global-pull-secret-syncer-hxk67\" (UID: \"908ddae4-92b4-4772-acb3-f93ed0f019ea\") " 
pod="kube-system/global-pull-secret-syncer-hxk67" Apr 16 17:40:27.888226 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:27.887976 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/908ddae4-92b4-4772-acb3-f93ed0f019ea-dbus\") pod \"global-pull-secret-syncer-hxk67\" (UID: \"908ddae4-92b4-4772-acb3-f93ed0f019ea\") " pod="kube-system/global-pull-secret-syncer-hxk67" Apr 16 17:40:27.888226 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:27.888022 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/908ddae4-92b4-4772-acb3-f93ed0f019ea-original-pull-secret\") pod \"global-pull-secret-syncer-hxk67\" (UID: \"908ddae4-92b4-4772-acb3-f93ed0f019ea\") " pod="kube-system/global-pull-secret-syncer-hxk67" Apr 16 17:40:27.888226 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:27.888141 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 17:40:27.888226 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:27.888167 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/908ddae4-92b4-4772-acb3-f93ed0f019ea-kubelet-config\") pod \"global-pull-secret-syncer-hxk67\" (UID: \"908ddae4-92b4-4772-acb3-f93ed0f019ea\") " pod="kube-system/global-pull-secret-syncer-hxk67" Apr 16 17:40:27.888226 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:27.888222 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/908ddae4-92b4-4772-acb3-f93ed0f019ea-original-pull-secret podName:908ddae4-92b4-4772-acb3-f93ed0f019ea nodeName:}" failed. No retries permitted until 2026-04-16 17:40:28.38819964 +0000 UTC m=+26.404667150 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/908ddae4-92b4-4772-acb3-f93ed0f019ea-original-pull-secret") pod "global-pull-secret-syncer-hxk67" (UID: "908ddae4-92b4-4772-acb3-f93ed0f019ea") : object "kube-system"/"original-pull-secret" not registered Apr 16 17:40:27.888561 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:27.888248 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/908ddae4-92b4-4772-acb3-f93ed0f019ea-dbus\") pod \"global-pull-secret-syncer-hxk67\" (UID: \"908ddae4-92b4-4772-acb3-f93ed0f019ea\") " pod="kube-system/global-pull-secret-syncer-hxk67" Apr 16 17:40:28.392434 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:28.392398 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/908ddae4-92b4-4772-acb3-f93ed0f019ea-original-pull-secret\") pod \"global-pull-secret-syncer-hxk67\" (UID: \"908ddae4-92b4-4772-acb3-f93ed0f019ea\") " pod="kube-system/global-pull-secret-syncer-hxk67" Apr 16 17:40:28.392993 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:28.392534 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 17:40:28.392993 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:28.392598 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/908ddae4-92b4-4772-acb3-f93ed0f019ea-original-pull-secret podName:908ddae4-92b4-4772-acb3-f93ed0f019ea nodeName:}" failed. No retries permitted until 2026-04-16 17:40:29.392580581 +0000 UTC m=+27.409048080 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/908ddae4-92b4-4772-acb3-f93ed0f019ea-original-pull-secret") pod "global-pull-secret-syncer-hxk67" (UID: "908ddae4-92b4-4772-acb3-f93ed0f019ea") : object "kube-system"/"original-pull-secret" not registered Apr 16 17:40:28.625526 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:28.625443 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tw2xb" Apr 16 17:40:28.625661 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:28.625550 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tw2xb" podUID="98ed775b-36f2-475e-9c1b-e1e3a5261ed5" Apr 16 17:40:28.731123 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:28.731091 2576 generic.go:358] "Generic (PLEG): container finished" podID="404aadc7-59c7-4274-841e-38902a95c670" containerID="3d3a55b59c6fa707c24926460a4e6d94f1416f2a35e4254af30988181d686cec" exitCode=0 Apr 16 17:40:28.731278 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:28.731191 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-t42f2" event={"ID":"404aadc7-59c7-4274-841e-38902a95c670","Type":"ContainerDied","Data":"3d3a55b59c6fa707c24926460a4e6d94f1416f2a35e4254af30988181d686cec"} Apr 16 17:40:28.732569 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:28.731879 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-8kd57" Apr 16 17:40:28.732569 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:28.731903 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-ovn-kubernetes/ovnkube-node-8kd57" Apr 16 17:40:28.732569 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:28.731934 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-8kd57" Apr 16 17:40:28.747375 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:28.747350 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8kd57" Apr 16 17:40:28.747491 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:28.747417 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8kd57" Apr 16 17:40:29.400695 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:29.400465 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/908ddae4-92b4-4772-acb3-f93ed0f019ea-original-pull-secret\") pod \"global-pull-secret-syncer-hxk67\" (UID: \"908ddae4-92b4-4772-acb3-f93ed0f019ea\") " pod="kube-system/global-pull-secret-syncer-hxk67" Apr 16 17:40:29.401032 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:29.400604 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 17:40:29.401032 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:29.400752 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/908ddae4-92b4-4772-acb3-f93ed0f019ea-original-pull-secret podName:908ddae4-92b4-4772-acb3-f93ed0f019ea nodeName:}" failed. No retries permitted until 2026-04-16 17:40:31.400737938 +0000 UTC m=+29.417205437 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/908ddae4-92b4-4772-acb3-f93ed0f019ea-original-pull-secret") pod "global-pull-secret-syncer-hxk67" (UID: "908ddae4-92b4-4772-acb3-f93ed0f019ea") : object "kube-system"/"original-pull-secret" not registered Apr 16 17:40:29.536252 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:29.536218 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-tw2xb"] Apr 16 17:40:29.536397 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:29.536347 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tw2xb" Apr 16 17:40:29.536493 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:29.536471 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tw2xb" podUID="98ed775b-36f2-475e-9c1b-e1e3a5261ed5" Apr 16 17:40:29.539350 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:29.539281 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-bs2mp"] Apr 16 17:40:29.539456 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:29.539403 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bs2mp" Apr 16 17:40:29.539518 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:29.539501 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-bs2mp" podUID="5a330027-b9ff-458e-aa78-e2eb5a0bda58" Apr 16 17:40:29.539811 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:29.539786 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-hxk67"] Apr 16 17:40:29.539899 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:29.539886 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hxk67" Apr 16 17:40:29.540005 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:29.539983 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hxk67" podUID="908ddae4-92b4-4772-acb3-f93ed0f019ea" Apr 16 17:40:29.735333 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:29.735303 2576 generic.go:358] "Generic (PLEG): container finished" podID="404aadc7-59c7-4274-841e-38902a95c670" containerID="2b75fcb8ed5b8273ce737549b38cc2bd5ea60a77dc3befc4255348d88302379b" exitCode=0 Apr 16 17:40:29.735493 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:29.735382 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-t42f2" event={"ID":"404aadc7-59c7-4274-841e-38902a95c670","Type":"ContainerDied","Data":"2b75fcb8ed5b8273ce737549b38cc2bd5ea60a77dc3befc4255348d88302379b"} Apr 16 17:40:30.739790 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:30.739741 2576 generic.go:358] "Generic (PLEG): container finished" podID="404aadc7-59c7-4274-841e-38902a95c670" containerID="0670863e26a9b375cb856d5a145efd3fa50ee78b241cac9b7956b8c1c16c47e4" exitCode=0 Apr 16 17:40:30.740282 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:30.739824 2576 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="openshift-multus/multus-additional-cni-plugins-t42f2" event={"ID":"404aadc7-59c7-4274-841e-38902a95c670","Type":"ContainerDied","Data":"0670863e26a9b375cb856d5a145efd3fa50ee78b241cac9b7956b8c1c16c47e4"} Apr 16 17:40:31.417625 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:31.417586 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/908ddae4-92b4-4772-acb3-f93ed0f019ea-original-pull-secret\") pod \"global-pull-secret-syncer-hxk67\" (UID: \"908ddae4-92b4-4772-acb3-f93ed0f019ea\") " pod="kube-system/global-pull-secret-syncer-hxk67" Apr 16 17:40:31.417812 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:31.417714 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 17:40:31.417812 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:31.417768 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/908ddae4-92b4-4772-acb3-f93ed0f019ea-original-pull-secret podName:908ddae4-92b4-4772-acb3-f93ed0f019ea nodeName:}" failed. No retries permitted until 2026-04-16 17:40:35.4177548 +0000 UTC m=+33.434222299 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/908ddae4-92b4-4772-acb3-f93ed0f019ea-original-pull-secret") pod "global-pull-secret-syncer-hxk67" (UID: "908ddae4-92b4-4772-acb3-f93ed0f019ea") : object "kube-system"/"original-pull-secret" not registered Apr 16 17:40:31.622579 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:31.622543 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bs2mp" Apr 16 17:40:31.622753 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:31.622543 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-hxk67" Apr 16 17:40:31.622753 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:31.622677 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bs2mp" podUID="5a330027-b9ff-458e-aa78-e2eb5a0bda58" Apr 16 17:40:31.622878 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:31.622768 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hxk67" podUID="908ddae4-92b4-4772-acb3-f93ed0f019ea" Apr 16 17:40:31.622878 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:31.622554 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tw2xb" Apr 16 17:40:31.622984 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:31.622892 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tw2xb" podUID="98ed775b-36f2-475e-9c1b-e1e3a5261ed5" Apr 16 17:40:33.622304 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:33.622216 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-tw2xb" Apr 16 17:40:33.622304 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:33.622267 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bs2mp" Apr 16 17:40:33.622985 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:33.622216 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hxk67" Apr 16 17:40:33.622985 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:33.622353 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tw2xb" podUID="98ed775b-36f2-475e-9c1b-e1e3a5261ed5" Apr 16 17:40:33.622985 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:33.622444 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hxk67" podUID="908ddae4-92b4-4772-acb3-f93ed0f019ea" Apr 16 17:40:33.622985 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:33.622520 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-bs2mp" podUID="5a330027-b9ff-458e-aa78-e2eb5a0bda58" Apr 16 17:40:35.263214 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:35.262938 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-32.ec2.internal" event="NodeReady" Apr 16 17:40:35.263615 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:35.263255 2576 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 16 17:40:35.300783 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:35.300749 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-57577fdccc-fltds"] Apr 16 17:40:35.330984 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:35.330947 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-vszkl"] Apr 16 17:40:35.331157 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:35.331049 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-57577fdccc-fltds" Apr 16 17:40:35.333773 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:35.333711 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 16 17:40:35.333773 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:35.333760 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-n5znr\"" Apr 16 17:40:35.334042 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:35.334009 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 16 17:40:35.334130 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:35.334068 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 16 17:40:35.339107 ip-10-0-141-32 
kubenswrapper[2576]: I0416 17:40:35.339082 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 16 17:40:35.349243 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:35.349211 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-pc75n"] Apr 16 17:40:35.349395 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:35.349374 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-vszkl" Apr 16 17:40:35.352246 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:35.352222 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 16 17:40:35.352387 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:35.352265 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 16 17:40:35.352387 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:35.352279 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-qhzh2\"" Apr 16 17:40:35.361156 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:35.361136 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-57577fdccc-fltds"] Apr 16 17:40:35.361276 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:35.361162 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-vszkl"] Apr 16 17:40:35.361276 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:35.361176 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-pc75n"] Apr 16 17:40:35.361360 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:35.361280 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-pc75n" Apr 16 17:40:35.363777 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:35.363755 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 16 17:40:35.363890 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:35.363800 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-zdjnc\"" Apr 16 17:40:35.363890 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:35.363753 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 16 17:40:35.363890 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:35.363831 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 16 17:40:35.447279 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:35.447246 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/643af40e-aabc-4fb6-8e21-7926a029b0a0-tmp-dir\") pod \"dns-default-vszkl\" (UID: \"643af40e-aabc-4fb6-8e21-7926a029b0a0\") " pod="openshift-dns/dns-default-vszkl" Apr 16 17:40:35.447466 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:35.447285 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lhrc\" (UniqueName: \"kubernetes.io/projected/643af40e-aabc-4fb6-8e21-7926a029b0a0-kube-api-access-8lhrc\") pod \"dns-default-vszkl\" (UID: \"643af40e-aabc-4fb6-8e21-7926a029b0a0\") " pod="openshift-dns/dns-default-vszkl" Apr 16 17:40:35.447466 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:35.447323 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/a3c3daf4-9be0-4443-af47-0a67097365fa-installation-pull-secrets\") pod \"image-registry-57577fdccc-fltds\" (UID: \"a3c3daf4-9be0-4443-af47-0a67097365fa\") " pod="openshift-image-registry/image-registry-57577fdccc-fltds" Apr 16 17:40:35.447466 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:35.447360 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/908ddae4-92b4-4772-acb3-f93ed0f019ea-original-pull-secret\") pod \"global-pull-secret-syncer-hxk67\" (UID: \"908ddae4-92b4-4772-acb3-f93ed0f019ea\") " pod="kube-system/global-pull-secret-syncer-hxk67" Apr 16 17:40:35.447636 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:35.447464 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a3c3daf4-9be0-4443-af47-0a67097365fa-registry-certificates\") pod \"image-registry-57577fdccc-fltds\" (UID: \"a3c3daf4-9be0-4443-af47-0a67097365fa\") " pod="openshift-image-registry/image-registry-57577fdccc-fltds" Apr 16 17:40:35.447636 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:35.447505 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a3c3daf4-9be0-4443-af47-0a67097365fa-trusted-ca\") pod \"image-registry-57577fdccc-fltds\" (UID: \"a3c3daf4-9be0-4443-af47-0a67097365fa\") " pod="openshift-image-registry/image-registry-57577fdccc-fltds" Apr 16 17:40:35.447636 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:35.447466 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 17:40:35.447636 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:35.447539 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/3bf148cf-abbf-4345-8487-fd40ceff855c-cert\") pod \"ingress-canary-pc75n\" (UID: \"3bf148cf-abbf-4345-8487-fd40ceff855c\") " pod="openshift-ingress-canary/ingress-canary-pc75n" Apr 16 17:40:35.447636 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:35.447585 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a3c3daf4-9be0-4443-af47-0a67097365fa-bound-sa-token\") pod \"image-registry-57577fdccc-fltds\" (UID: \"a3c3daf4-9be0-4443-af47-0a67097365fa\") " pod="openshift-image-registry/image-registry-57577fdccc-fltds" Apr 16 17:40:35.447636 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:35.447611 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/908ddae4-92b4-4772-acb3-f93ed0f019ea-original-pull-secret podName:908ddae4-92b4-4772-acb3-f93ed0f019ea nodeName:}" failed. No retries permitted until 2026-04-16 17:40:43.447592664 +0000 UTC m=+41.464060178 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/908ddae4-92b4-4772-acb3-f93ed0f019ea-original-pull-secret") pod "global-pull-secret-syncer-hxk67" (UID: "908ddae4-92b4-4772-acb3-f93ed0f019ea") : object "kube-system"/"original-pull-secret" not registered Apr 16 17:40:35.447935 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:35.447657 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2gpm\" (UniqueName: \"kubernetes.io/projected/3bf148cf-abbf-4345-8487-fd40ceff855c-kube-api-access-f2gpm\") pod \"ingress-canary-pc75n\" (UID: \"3bf148cf-abbf-4345-8487-fd40ceff855c\") " pod="openshift-ingress-canary/ingress-canary-pc75n" Apr 16 17:40:35.447935 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:35.447691 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a3c3daf4-9be0-4443-af47-0a67097365fa-registry-tls\") pod \"image-registry-57577fdccc-fltds\" (UID: \"a3c3daf4-9be0-4443-af47-0a67097365fa\") " pod="openshift-image-registry/image-registry-57577fdccc-fltds" Apr 16 17:40:35.447935 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:35.447709 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/643af40e-aabc-4fb6-8e21-7926a029b0a0-metrics-tls\") pod \"dns-default-vszkl\" (UID: \"643af40e-aabc-4fb6-8e21-7926a029b0a0\") " pod="openshift-dns/dns-default-vszkl" Apr 16 17:40:35.447935 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:35.447759 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/a3c3daf4-9be0-4443-af47-0a67097365fa-image-registry-private-configuration\") pod \"image-registry-57577fdccc-fltds\" (UID: 
\"a3c3daf4-9be0-4443-af47-0a67097365fa\") " pod="openshift-image-registry/image-registry-57577fdccc-fltds" Apr 16 17:40:35.447935 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:35.447785 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47fjl\" (UniqueName: \"kubernetes.io/projected/a3c3daf4-9be0-4443-af47-0a67097365fa-kube-api-access-47fjl\") pod \"image-registry-57577fdccc-fltds\" (UID: \"a3c3daf4-9be0-4443-af47-0a67097365fa\") " pod="openshift-image-registry/image-registry-57577fdccc-fltds" Apr 16 17:40:35.447935 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:35.447808 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a3c3daf4-9be0-4443-af47-0a67097365fa-ca-trust-extracted\") pod \"image-registry-57577fdccc-fltds\" (UID: \"a3c3daf4-9be0-4443-af47-0a67097365fa\") " pod="openshift-image-registry/image-registry-57577fdccc-fltds" Apr 16 17:40:35.447935 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:35.447822 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/643af40e-aabc-4fb6-8e21-7926a029b0a0-config-volume\") pod \"dns-default-vszkl\" (UID: \"643af40e-aabc-4fb6-8e21-7926a029b0a0\") " pod="openshift-dns/dns-default-vszkl" Apr 16 17:40:35.548655 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:35.548578 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a3c3daf4-9be0-4443-af47-0a67097365fa-installation-pull-secrets\") pod \"image-registry-57577fdccc-fltds\" (UID: \"a3c3daf4-9be0-4443-af47-0a67097365fa\") " pod="openshift-image-registry/image-registry-57577fdccc-fltds" Apr 16 17:40:35.548655 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:35.548627 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a3c3daf4-9be0-4443-af47-0a67097365fa-registry-certificates\") pod \"image-registry-57577fdccc-fltds\" (UID: \"a3c3daf4-9be0-4443-af47-0a67097365fa\") " pod="openshift-image-registry/image-registry-57577fdccc-fltds" Apr 16 17:40:35.548655 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:35.548645 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a3c3daf4-9be0-4443-af47-0a67097365fa-trusted-ca\") pod \"image-registry-57577fdccc-fltds\" (UID: \"a3c3daf4-9be0-4443-af47-0a67097365fa\") " pod="openshift-image-registry/image-registry-57577fdccc-fltds" Apr 16 17:40:35.548935 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:35.548747 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3bf148cf-abbf-4345-8487-fd40ceff855c-cert\") pod \"ingress-canary-pc75n\" (UID: \"3bf148cf-abbf-4345-8487-fd40ceff855c\") " pod="openshift-ingress-canary/ingress-canary-pc75n" Apr 16 17:40:35.548935 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:35.548804 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a3c3daf4-9be0-4443-af47-0a67097365fa-bound-sa-token\") pod \"image-registry-57577fdccc-fltds\" (UID: \"a3c3daf4-9be0-4443-af47-0a67097365fa\") " pod="openshift-image-registry/image-registry-57577fdccc-fltds" Apr 16 17:40:35.548935 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:35.548835 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f2gpm\" (UniqueName: \"kubernetes.io/projected/3bf148cf-abbf-4345-8487-fd40ceff855c-kube-api-access-f2gpm\") pod \"ingress-canary-pc75n\" (UID: \"3bf148cf-abbf-4345-8487-fd40ceff855c\") " pod="openshift-ingress-canary/ingress-canary-pc75n" Apr 16 
17:40:35.548935 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:35.548872 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a3c3daf4-9be0-4443-af47-0a67097365fa-registry-tls\") pod \"image-registry-57577fdccc-fltds\" (UID: \"a3c3daf4-9be0-4443-af47-0a67097365fa\") " pod="openshift-image-registry/image-registry-57577fdccc-fltds" Apr 16 17:40:35.548935 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:35.548894 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 17:40:35.549183 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:35.548965 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 17:40:35.549183 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:35.548989 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3bf148cf-abbf-4345-8487-fd40ceff855c-cert podName:3bf148cf-abbf-4345-8487-fd40ceff855c nodeName:}" failed. No retries permitted until 2026-04-16 17:40:36.048968472 +0000 UTC m=+34.065435972 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3bf148cf-abbf-4345-8487-fd40ceff855c-cert") pod "ingress-canary-pc75n" (UID: "3bf148cf-abbf-4345-8487-fd40ceff855c") : secret "canary-serving-cert" not found Apr 16 17:40:35.549183 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:35.549017 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/643af40e-aabc-4fb6-8e21-7926a029b0a0-metrics-tls podName:643af40e-aabc-4fb6-8e21-7926a029b0a0 nodeName:}" failed. No retries permitted until 2026-04-16 17:40:36.049002355 +0000 UTC m=+34.065469867 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/643af40e-aabc-4fb6-8e21-7926a029b0a0-metrics-tls") pod "dns-default-vszkl" (UID: "643af40e-aabc-4fb6-8e21-7926a029b0a0") : secret "dns-default-metrics-tls" not found Apr 16 17:40:35.549183 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:35.549068 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 17:40:35.549183 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:35.549078 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-57577fdccc-fltds: secret "image-registry-tls" not found Apr 16 17:40:35.549183 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:35.548897 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/643af40e-aabc-4fb6-8e21-7926a029b0a0-metrics-tls\") pod \"dns-default-vszkl\" (UID: \"643af40e-aabc-4fb6-8e21-7926a029b0a0\") " pod="openshift-dns/dns-default-vszkl" Apr 16 17:40:35.549183 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:35.549106 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a3c3daf4-9be0-4443-af47-0a67097365fa-registry-tls podName:a3c3daf4-9be0-4443-af47-0a67097365fa nodeName:}" failed. No retries permitted until 2026-04-16 17:40:36.049097207 +0000 UTC m=+34.065564739 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a3c3daf4-9be0-4443-af47-0a67097365fa-registry-tls") pod "image-registry-57577fdccc-fltds" (UID: "a3c3daf4-9be0-4443-af47-0a67097365fa") : secret "image-registry-tls" not found Apr 16 17:40:35.549183 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:35.549171 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/a3c3daf4-9be0-4443-af47-0a67097365fa-image-registry-private-configuration\") pod \"image-registry-57577fdccc-fltds\" (UID: \"a3c3daf4-9be0-4443-af47-0a67097365fa\") " pod="openshift-image-registry/image-registry-57577fdccc-fltds" Apr 16 17:40:35.549600 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:35.549200 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-47fjl\" (UniqueName: \"kubernetes.io/projected/a3c3daf4-9be0-4443-af47-0a67097365fa-kube-api-access-47fjl\") pod \"image-registry-57577fdccc-fltds\" (UID: \"a3c3daf4-9be0-4443-af47-0a67097365fa\") " pod="openshift-image-registry/image-registry-57577fdccc-fltds" Apr 16 17:40:35.549600 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:35.549240 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a3c3daf4-9be0-4443-af47-0a67097365fa-ca-trust-extracted\") pod \"image-registry-57577fdccc-fltds\" (UID: \"a3c3daf4-9be0-4443-af47-0a67097365fa\") " pod="openshift-image-registry/image-registry-57577fdccc-fltds" Apr 16 17:40:35.549600 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:35.549316 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a3c3daf4-9be0-4443-af47-0a67097365fa-registry-certificates\") pod \"image-registry-57577fdccc-fltds\" (UID: \"a3c3daf4-9be0-4443-af47-0a67097365fa\") " 
pod="openshift-image-registry/image-registry-57577fdccc-fltds" Apr 16 17:40:35.549600 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:35.549396 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/643af40e-aabc-4fb6-8e21-7926a029b0a0-config-volume\") pod \"dns-default-vszkl\" (UID: \"643af40e-aabc-4fb6-8e21-7926a029b0a0\") " pod="openshift-dns/dns-default-vszkl" Apr 16 17:40:35.549600 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:35.549459 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/643af40e-aabc-4fb6-8e21-7926a029b0a0-tmp-dir\") pod \"dns-default-vszkl\" (UID: \"643af40e-aabc-4fb6-8e21-7926a029b0a0\") " pod="openshift-dns/dns-default-vszkl" Apr 16 17:40:35.549600 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:35.549501 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8lhrc\" (UniqueName: \"kubernetes.io/projected/643af40e-aabc-4fb6-8e21-7926a029b0a0-kube-api-access-8lhrc\") pod \"dns-default-vszkl\" (UID: \"643af40e-aabc-4fb6-8e21-7926a029b0a0\") " pod="openshift-dns/dns-default-vszkl" Apr 16 17:40:35.549823 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:35.549767 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a3c3daf4-9be0-4443-af47-0a67097365fa-trusted-ca\") pod \"image-registry-57577fdccc-fltds\" (UID: \"a3c3daf4-9be0-4443-af47-0a67097365fa\") " pod="openshift-image-registry/image-registry-57577fdccc-fltds" Apr 16 17:40:35.549994 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:35.549970 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/643af40e-aabc-4fb6-8e21-7926a029b0a0-config-volume\") pod \"dns-default-vszkl\" (UID: \"643af40e-aabc-4fb6-8e21-7926a029b0a0\") " 
pod="openshift-dns/dns-default-vszkl" Apr 16 17:40:35.557594 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:35.557545 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a3c3daf4-9be0-4443-af47-0a67097365fa-ca-trust-extracted\") pod \"image-registry-57577fdccc-fltds\" (UID: \"a3c3daf4-9be0-4443-af47-0a67097365fa\") " pod="openshift-image-registry/image-registry-57577fdccc-fltds" Apr 16 17:40:35.557594 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:35.557558 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/643af40e-aabc-4fb6-8e21-7926a029b0a0-tmp-dir\") pod \"dns-default-vszkl\" (UID: \"643af40e-aabc-4fb6-8e21-7926a029b0a0\") " pod="openshift-dns/dns-default-vszkl" Apr 16 17:40:35.557899 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:35.557876 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/a3c3daf4-9be0-4443-af47-0a67097365fa-image-registry-private-configuration\") pod \"image-registry-57577fdccc-fltds\" (UID: \"a3c3daf4-9be0-4443-af47-0a67097365fa\") " pod="openshift-image-registry/image-registry-57577fdccc-fltds" Apr 16 17:40:35.558139 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:35.558115 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a3c3daf4-9be0-4443-af47-0a67097365fa-installation-pull-secrets\") pod \"image-registry-57577fdccc-fltds\" (UID: \"a3c3daf4-9be0-4443-af47-0a67097365fa\") " pod="openshift-image-registry/image-registry-57577fdccc-fltds" Apr 16 17:40:35.566654 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:35.566627 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8lhrc\" (UniqueName: 
\"kubernetes.io/projected/643af40e-aabc-4fb6-8e21-7926a029b0a0-kube-api-access-8lhrc\") pod \"dns-default-vszkl\" (UID: \"643af40e-aabc-4fb6-8e21-7926a029b0a0\") " pod="openshift-dns/dns-default-vszkl" Apr 16 17:40:35.567067 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:35.567047 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a3c3daf4-9be0-4443-af47-0a67097365fa-bound-sa-token\") pod \"image-registry-57577fdccc-fltds\" (UID: \"a3c3daf4-9be0-4443-af47-0a67097365fa\") " pod="openshift-image-registry/image-registry-57577fdccc-fltds" Apr 16 17:40:35.567067 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:35.567060 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2gpm\" (UniqueName: \"kubernetes.io/projected/3bf148cf-abbf-4345-8487-fd40ceff855c-kube-api-access-f2gpm\") pod \"ingress-canary-pc75n\" (UID: \"3bf148cf-abbf-4345-8487-fd40ceff855c\") " pod="openshift-ingress-canary/ingress-canary-pc75n" Apr 16 17:40:35.567553 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:35.567531 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-47fjl\" (UniqueName: \"kubernetes.io/projected/a3c3daf4-9be0-4443-af47-0a67097365fa-kube-api-access-47fjl\") pod \"image-registry-57577fdccc-fltds\" (UID: \"a3c3daf4-9be0-4443-af47-0a67097365fa\") " pod="openshift-image-registry/image-registry-57577fdccc-fltds" Apr 16 17:40:35.622239 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:35.622199 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tw2xb" Apr 16 17:40:35.622239 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:35.622221 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bs2mp" Apr 16 17:40:35.622482 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:35.622200 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hxk67" Apr 16 17:40:35.625073 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:35.625033 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 17:40:35.625073 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:35.625058 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 17:40:35.625073 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:35.625071 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 17:40:35.625282 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:35.625232 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-wdxwt\"" Apr 16 17:40:35.625394 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:35.625378 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 17:40:35.625438 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:35.625426 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-hvrx4\"" Apr 16 17:40:36.054130 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:36.054092 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3bf148cf-abbf-4345-8487-fd40ceff855c-cert\") pod \"ingress-canary-pc75n\" (UID: \"3bf148cf-abbf-4345-8487-fd40ceff855c\") " pod="openshift-ingress-canary/ingress-canary-pc75n" Apr 16 17:40:36.054387 
ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:36.054159 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a3c3daf4-9be0-4443-af47-0a67097365fa-registry-tls\") pod \"image-registry-57577fdccc-fltds\" (UID: \"a3c3daf4-9be0-4443-af47-0a67097365fa\") " pod="openshift-image-registry/image-registry-57577fdccc-fltds" Apr 16 17:40:36.054387 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:36.054190 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/643af40e-aabc-4fb6-8e21-7926a029b0a0-metrics-tls\") pod \"dns-default-vszkl\" (UID: \"643af40e-aabc-4fb6-8e21-7926a029b0a0\") " pod="openshift-dns/dns-default-vszkl" Apr 16 17:40:36.054387 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:36.054224 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 17:40:36.054387 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:36.054297 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3bf148cf-abbf-4345-8487-fd40ceff855c-cert podName:3bf148cf-abbf-4345-8487-fd40ceff855c nodeName:}" failed. No retries permitted until 2026-04-16 17:40:37.054277051 +0000 UTC m=+35.070744622 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3bf148cf-abbf-4345-8487-fd40ceff855c-cert") pod "ingress-canary-pc75n" (UID: "3bf148cf-abbf-4345-8487-fd40ceff855c") : secret "canary-serving-cert" not found Apr 16 17:40:36.054387 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:36.054299 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 17:40:36.054387 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:36.054314 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-57577fdccc-fltds: secret "image-registry-tls" not found Apr 16 17:40:36.054387 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:36.054324 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 17:40:36.054387 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:36.054370 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a3c3daf4-9be0-4443-af47-0a67097365fa-registry-tls podName:a3c3daf4-9be0-4443-af47-0a67097365fa nodeName:}" failed. No retries permitted until 2026-04-16 17:40:37.054353465 +0000 UTC m=+35.070820973 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a3c3daf4-9be0-4443-af47-0a67097365fa-registry-tls") pod "image-registry-57577fdccc-fltds" (UID: "a3c3daf4-9be0-4443-af47-0a67097365fa") : secret "image-registry-tls" not found Apr 16 17:40:36.054712 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:36.054400 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/643af40e-aabc-4fb6-8e21-7926a029b0a0-metrics-tls podName:643af40e-aabc-4fb6-8e21-7926a029b0a0 nodeName:}" failed. No retries permitted until 2026-04-16 17:40:37.054385706 +0000 UTC m=+35.070853225 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/643af40e-aabc-4fb6-8e21-7926a029b0a0-metrics-tls") pod "dns-default-vszkl" (UID: "643af40e-aabc-4fb6-8e21-7926a029b0a0") : secret "dns-default-metrics-tls" not found Apr 16 17:40:36.256511 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:36.256471 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/98ed775b-36f2-475e-9c1b-e1e3a5261ed5-metrics-certs\") pod \"network-metrics-daemon-tw2xb\" (UID: \"98ed775b-36f2-475e-9c1b-e1e3a5261ed5\") " pod="openshift-multus/network-metrics-daemon-tw2xb" Apr 16 17:40:36.256695 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:36.256629 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 17:40:36.256751 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:36.256714 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98ed775b-36f2-475e-9c1b-e1e3a5261ed5-metrics-certs podName:98ed775b-36f2-475e-9c1b-e1e3a5261ed5 nodeName:}" failed. No retries permitted until 2026-04-16 17:41:08.256698299 +0000 UTC m=+66.273165816 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/98ed775b-36f2-475e-9c1b-e1e3a5261ed5-metrics-certs") pod "network-metrics-daemon-tw2xb" (UID: "98ed775b-36f2-475e-9c1b-e1e3a5261ed5") : secret "metrics-daemon-secret" not found Apr 16 17:40:36.356949 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:36.356839 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c6rr5\" (UniqueName: \"kubernetes.io/projected/5a330027-b9ff-458e-aa78-e2eb5a0bda58-kube-api-access-c6rr5\") pod \"network-check-target-bs2mp\" (UID: \"5a330027-b9ff-458e-aa78-e2eb5a0bda58\") " pod="openshift-network-diagnostics/network-check-target-bs2mp" Apr 16 17:40:36.359287 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:36.359267 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6rr5\" (UniqueName: \"kubernetes.io/projected/5a330027-b9ff-458e-aa78-e2eb5a0bda58-kube-api-access-c6rr5\") pod \"network-check-target-bs2mp\" (UID: \"5a330027-b9ff-458e-aa78-e2eb5a0bda58\") " pod="openshift-network-diagnostics/network-check-target-bs2mp" Apr 16 17:40:36.546170 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:36.546135 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bs2mp"
Apr 16 17:40:36.719954 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:36.719923 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-bs2mp"]
Apr 16 17:40:36.726994 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:36.726959 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a330027_b9ff_458e_aa78_e2eb5a0bda58.slice/crio-fe3eb098e20ea4ad3b8e8b685c4c8df3c32812ae0f86d3c2df39baac09839712 WatchSource:0}: Error finding container fe3eb098e20ea4ad3b8e8b685c4c8df3c32812ae0f86d3c2df39baac09839712: Status 404 returned error can't find the container with id fe3eb098e20ea4ad3b8e8b685c4c8df3c32812ae0f86d3c2df39baac09839712
Apr 16 17:40:36.751866 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:36.751836 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-bs2mp" event={"ID":"5a330027-b9ff-458e-aa78-e2eb5a0bda58","Type":"ContainerStarted","Data":"fe3eb098e20ea4ad3b8e8b685c4c8df3c32812ae0f86d3c2df39baac09839712"}
Apr 16 17:40:37.062203 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:37.062170 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3bf148cf-abbf-4345-8487-fd40ceff855c-cert\") pod \"ingress-canary-pc75n\" (UID: \"3bf148cf-abbf-4345-8487-fd40ceff855c\") " pod="openshift-ingress-canary/ingress-canary-pc75n"
Apr 16 17:40:37.062363 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:37.062222 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a3c3daf4-9be0-4443-af47-0a67097365fa-registry-tls\") pod \"image-registry-57577fdccc-fltds\" (UID: \"a3c3daf4-9be0-4443-af47-0a67097365fa\") " pod="openshift-image-registry/image-registry-57577fdccc-fltds"
Apr 16 17:40:37.062363 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:37.062240 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/643af40e-aabc-4fb6-8e21-7926a029b0a0-metrics-tls\") pod \"dns-default-vszkl\" (UID: \"643af40e-aabc-4fb6-8e21-7926a029b0a0\") " pod="openshift-dns/dns-default-vszkl"
Apr 16 17:40:37.062363 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:37.062298 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 17:40:37.062363 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:37.062319 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 17:40:37.062363 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:37.062335 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 17:40:37.062363 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:37.062351 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-57577fdccc-fltds: secret "image-registry-tls" not found
Apr 16 17:40:37.062762 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:37.062372 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3bf148cf-abbf-4345-8487-fd40ceff855c-cert podName:3bf148cf-abbf-4345-8487-fd40ceff855c nodeName:}" failed. No retries permitted until 2026-04-16 17:40:39.062354975 +0000 UTC m=+37.078822478 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3bf148cf-abbf-4345-8487-fd40ceff855c-cert") pod "ingress-canary-pc75n" (UID: "3bf148cf-abbf-4345-8487-fd40ceff855c") : secret "canary-serving-cert" not found
Apr 16 17:40:37.062762 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:37.062390 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a3c3daf4-9be0-4443-af47-0a67097365fa-registry-tls podName:a3c3daf4-9be0-4443-af47-0a67097365fa nodeName:}" failed. No retries permitted until 2026-04-16 17:40:39.062383882 +0000 UTC m=+37.078851382 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a3c3daf4-9be0-4443-af47-0a67097365fa-registry-tls") pod "image-registry-57577fdccc-fltds" (UID: "a3c3daf4-9be0-4443-af47-0a67097365fa") : secret "image-registry-tls" not found
Apr 16 17:40:37.062762 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:37.062400 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/643af40e-aabc-4fb6-8e21-7926a029b0a0-metrics-tls podName:643af40e-aabc-4fb6-8e21-7926a029b0a0 nodeName:}" failed. No retries permitted until 2026-04-16 17:40:39.062394924 +0000 UTC m=+37.078862422 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/643af40e-aabc-4fb6-8e21-7926a029b0a0-metrics-tls") pod "dns-default-vszkl" (UID: "643af40e-aabc-4fb6-8e21-7926a029b0a0") : secret "dns-default-metrics-tls" not found
Apr 16 17:40:37.587511 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:37.587477 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-5vrzw"]
Apr 16 17:40:37.613480 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:37.613438 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-5vrzw"]
Apr 16 17:40:37.613636 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:37.613603 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-5vrzw"
Apr 16 17:40:37.616500 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:37.616480 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-fc2cr\""
Apr 16 17:40:37.616646 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:37.616480 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\""
Apr 16 17:40:37.617409 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:37.617386 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\""
Apr 16 17:40:37.757353 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:37.757303 2576 generic.go:358] "Generic (PLEG): container finished" podID="404aadc7-59c7-4274-841e-38902a95c670" containerID="4ab16069cbf0e3e050142a9132e486367b461daa4f6daedfd2528e0c69da8091" exitCode=0
Apr 16 17:40:37.757538 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:37.757390 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-t42f2" event={"ID":"404aadc7-59c7-4274-841e-38902a95c670","Type":"ContainerDied","Data":"4ab16069cbf0e3e050142a9132e486367b461daa4f6daedfd2528e0c69da8091"}
Apr 16 17:40:37.768604 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:37.768576 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9qf6\" (UniqueName: \"kubernetes.io/projected/d6d3f0ae-15c7-4a35-80c9-6fae8ac4f2af-kube-api-access-d9qf6\") pod \"migrator-64d4d94569-5vrzw\" (UID: \"d6d3f0ae-15c7-4a35-80c9-6fae8ac4f2af\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-5vrzw"
Apr 16 17:40:37.870080 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:37.869996 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d9qf6\" (UniqueName: \"kubernetes.io/projected/d6d3f0ae-15c7-4a35-80c9-6fae8ac4f2af-kube-api-access-d9qf6\") pod \"migrator-64d4d94569-5vrzw\" (UID: \"d6d3f0ae-15c7-4a35-80c9-6fae8ac4f2af\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-5vrzw"
Apr 16 17:40:37.883185 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:37.883152 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9qf6\" (UniqueName: \"kubernetes.io/projected/d6d3f0ae-15c7-4a35-80c9-6fae8ac4f2af-kube-api-access-d9qf6\") pod \"migrator-64d4d94569-5vrzw\" (UID: \"d6d3f0ae-15c7-4a35-80c9-6fae8ac4f2af\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-5vrzw"
Apr 16 17:40:37.924600 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:37.924568 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-5vrzw"
Apr 16 17:40:38.067574 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:38.067537 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-5vrzw"]
Apr 16 17:40:38.070633 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:38.070602 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6d3f0ae_15c7_4a35_80c9_6fae8ac4f2af.slice/crio-634e9bb9111a4f60fddff252f8761ba88a645184ab3965d9d000757aedc18dd6 WatchSource:0}: Error finding container 634e9bb9111a4f60fddff252f8761ba88a645184ab3965d9d000757aedc18dd6: Status 404 returned error can't find the container with id 634e9bb9111a4f60fddff252f8761ba88a645184ab3965d9d000757aedc18dd6
Apr 16 17:40:38.763710 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:38.763663 2576 generic.go:358] "Generic (PLEG): container finished" podID="404aadc7-59c7-4274-841e-38902a95c670" containerID="c098a8cc94862297f236575a213f72fa6d376ac7ea5c7f3581899a214bc001e2" exitCode=0
Apr 16 17:40:38.764373 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:38.763748 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-t42f2" event={"ID":"404aadc7-59c7-4274-841e-38902a95c670","Type":"ContainerDied","Data":"c098a8cc94862297f236575a213f72fa6d376ac7ea5c7f3581899a214bc001e2"}
Apr 16 17:40:38.765095 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:38.765040 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-5vrzw" event={"ID":"d6d3f0ae-15c7-4a35-80c9-6fae8ac4f2af","Type":"ContainerStarted","Data":"634e9bb9111a4f60fddff252f8761ba88a645184ab3965d9d000757aedc18dd6"}
Apr 16 17:40:39.079736 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:39.079709 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3bf148cf-abbf-4345-8487-fd40ceff855c-cert\") pod \"ingress-canary-pc75n\" (UID: \"3bf148cf-abbf-4345-8487-fd40ceff855c\") " pod="openshift-ingress-canary/ingress-canary-pc75n"
Apr 16 17:40:39.079848 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:39.079772 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a3c3daf4-9be0-4443-af47-0a67097365fa-registry-tls\") pod \"image-registry-57577fdccc-fltds\" (UID: \"a3c3daf4-9be0-4443-af47-0a67097365fa\") " pod="openshift-image-registry/image-registry-57577fdccc-fltds"
Apr 16 17:40:39.079848 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:39.079795 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/643af40e-aabc-4fb6-8e21-7926a029b0a0-metrics-tls\") pod \"dns-default-vszkl\" (UID: \"643af40e-aabc-4fb6-8e21-7926a029b0a0\") " pod="openshift-dns/dns-default-vszkl"
Apr 16 17:40:39.079956 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:39.079876 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 17:40:39.080008 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:39.079961 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3bf148cf-abbf-4345-8487-fd40ceff855c-cert podName:3bf148cf-abbf-4345-8487-fd40ceff855c nodeName:}" failed. No retries permitted until 2026-04-16 17:40:43.079940314 +0000 UTC m=+41.096407828 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3bf148cf-abbf-4345-8487-fd40ceff855c-cert") pod "ingress-canary-pc75n" (UID: "3bf148cf-abbf-4345-8487-fd40ceff855c") : secret "canary-serving-cert" not found
Apr 16 17:40:39.080892 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:39.080285 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 17:40:39.080892 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:39.080303 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 17:40:39.080892 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:39.080309 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-57577fdccc-fltds: secret "image-registry-tls" not found
Apr 16 17:40:39.080892 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:39.080366 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a3c3daf4-9be0-4443-af47-0a67097365fa-registry-tls podName:a3c3daf4-9be0-4443-af47-0a67097365fa nodeName:}" failed. No retries permitted until 2026-04-16 17:40:43.080349202 +0000 UTC m=+41.096816714 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a3c3daf4-9be0-4443-af47-0a67097365fa-registry-tls") pod "image-registry-57577fdccc-fltds" (UID: "a3c3daf4-9be0-4443-af47-0a67097365fa") : secret "image-registry-tls" not found
Apr 16 17:40:39.080892 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:39.080384 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/643af40e-aabc-4fb6-8e21-7926a029b0a0-metrics-tls podName:643af40e-aabc-4fb6-8e21-7926a029b0a0 nodeName:}" failed. No retries permitted until 2026-04-16 17:40:43.080374316 +0000 UTC m=+41.096841817 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/643af40e-aabc-4fb6-8e21-7926a029b0a0-metrics-tls") pod "dns-default-vszkl" (UID: "643af40e-aabc-4fb6-8e21-7926a029b0a0") : secret "dns-default-metrics-tls" not found
Apr 16 17:40:40.482371 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:40.482120 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-8q7zf"]
Apr 16 17:40:40.499964 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:40.499936 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-8q7zf"]
Apr 16 17:40:40.500113 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:40.500043 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-bfc587fb7-8q7zf"
Apr 16 17:40:40.504878 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:40.504853 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\""
Apr 16 17:40:40.508536 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:40.508513 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-qlndg\""
Apr 16 17:40:40.508674 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:40.508603 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\""
Apr 16 17:40:40.508674 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:40.508608 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\""
Apr 16 17:40:40.508789 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:40.508776 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\""
Apr 16 17:40:40.695277 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:40.695240 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/8c4618c0-4652-426d-8d6e-55ebeef5c000-signing-cabundle\") pod \"service-ca-bfc587fb7-8q7zf\" (UID: \"8c4618c0-4652-426d-8d6e-55ebeef5c000\") " pod="openshift-service-ca/service-ca-bfc587fb7-8q7zf"
Apr 16 17:40:40.695455 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:40.695297 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/8c4618c0-4652-426d-8d6e-55ebeef5c000-signing-key\") pod \"service-ca-bfc587fb7-8q7zf\" (UID: \"8c4618c0-4652-426d-8d6e-55ebeef5c000\") " pod="openshift-service-ca/service-ca-bfc587fb7-8q7zf"
Apr 16 17:40:40.695455 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:40.695329 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzz5n\" (UniqueName: \"kubernetes.io/projected/8c4618c0-4652-426d-8d6e-55ebeef5c000-kube-api-access-xzz5n\") pod \"service-ca-bfc587fb7-8q7zf\" (UID: \"8c4618c0-4652-426d-8d6e-55ebeef5c000\") " pod="openshift-service-ca/service-ca-bfc587fb7-8q7zf"
Apr 16 17:40:40.770719 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:40.770626 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-bs2mp" event={"ID":"5a330027-b9ff-458e-aa78-e2eb5a0bda58","Type":"ContainerStarted","Data":"7c8477b81a39a6cec1481b72b88d7ef8c4eb52f68aaa3b6a3b396fb1f8ec5f4b"}
Apr 16 17:40:40.770870 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:40.770728 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-bs2mp"
Apr 16 17:40:40.772114 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:40.772093 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-5vrzw" event={"ID":"d6d3f0ae-15c7-4a35-80c9-6fae8ac4f2af","Type":"ContainerStarted","Data":"24509f63efe9dc3220048c8cd3aba9bb9bafc72648d694b184039c2bf979e02b"}
Apr 16 17:40:40.772114 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:40.772117 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-5vrzw" event={"ID":"d6d3f0ae-15c7-4a35-80c9-6fae8ac4f2af","Type":"ContainerStarted","Data":"e215e470d1beb58ae926d41c874e65eaa57af6ebe95e407b4e0c8b234b4da4dc"}
Apr 16 17:40:40.774765 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:40.774746 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-t42f2" event={"ID":"404aadc7-59c7-4274-841e-38902a95c670","Type":"ContainerStarted","Data":"3922bf3bb6eee1393fff2e7f54113b5fc1c93cdcaaf5a75b7e4d481da83b71a9"}
Apr 16 17:40:40.793567 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:40.793520 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-bs2mp" podStartSLOduration=35.36373854 podStartE2EDuration="38.793504736s" podCreationTimestamp="2026-04-16 17:40:02 +0000 UTC" firstStartedPulling="2026-04-16 17:40:36.741974315 +0000 UTC m=+34.758441813" lastFinishedPulling="2026-04-16 17:40:40.171740508 +0000 UTC m=+38.188208009" observedRunningTime="2026-04-16 17:40:40.792826675 +0000 UTC m=+38.809294195" watchObservedRunningTime="2026-04-16 17:40:40.793504736 +0000 UTC m=+38.809972250"
Apr 16 17:40:40.796092 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:40.796070 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/8c4618c0-4652-426d-8d6e-55ebeef5c000-signing-cabundle\") pod \"service-ca-bfc587fb7-8q7zf\" (UID: \"8c4618c0-4652-426d-8d6e-55ebeef5c000\") " pod="openshift-service-ca/service-ca-bfc587fb7-8q7zf"
Apr 16 17:40:40.796185 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:40.796121 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/8c4618c0-4652-426d-8d6e-55ebeef5c000-signing-key\") pod \"service-ca-bfc587fb7-8q7zf\" (UID: \"8c4618c0-4652-426d-8d6e-55ebeef5c000\") " pod="openshift-service-ca/service-ca-bfc587fb7-8q7zf"
Apr 16 17:40:40.796185 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:40.796151 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xzz5n\" (UniqueName: \"kubernetes.io/projected/8c4618c0-4652-426d-8d6e-55ebeef5c000-kube-api-access-xzz5n\") pod \"service-ca-bfc587fb7-8q7zf\" (UID: \"8c4618c0-4652-426d-8d6e-55ebeef5c000\") " pod="openshift-service-ca/service-ca-bfc587fb7-8q7zf"
Apr 16 17:40:40.799647 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:40.799622 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/8c4618c0-4652-426d-8d6e-55ebeef5c000-signing-key\") pod \"service-ca-bfc587fb7-8q7zf\" (UID: \"8c4618c0-4652-426d-8d6e-55ebeef5c000\") " pod="openshift-service-ca/service-ca-bfc587fb7-8q7zf"
Apr 16 17:40:40.807922 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:40.807881 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/8c4618c0-4652-426d-8d6e-55ebeef5c000-signing-cabundle\") pod \"service-ca-bfc587fb7-8q7zf\" (UID: \"8c4618c0-4652-426d-8d6e-55ebeef5c000\") " pod="openshift-service-ca/service-ca-bfc587fb7-8q7zf"
Apr 16 17:40:40.809382 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:40.809360 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzz5n\" (UniqueName: \"kubernetes.io/projected/8c4618c0-4652-426d-8d6e-55ebeef5c000-kube-api-access-xzz5n\") pod \"service-ca-bfc587fb7-8q7zf\" (UID: \"8c4618c0-4652-426d-8d6e-55ebeef5c000\") " pod="openshift-service-ca/service-ca-bfc587fb7-8q7zf"
Apr 16 17:40:40.820616 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:40.820572 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-5vrzw" podStartSLOduration=1.720489818 podStartE2EDuration="3.820558705s" podCreationTimestamp="2026-04-16 17:40:37 +0000 UTC" firstStartedPulling="2026-04-16 17:40:38.073063824 +0000 UTC m=+36.089531323" lastFinishedPulling="2026-04-16 17:40:40.173132711 +0000 UTC m=+38.189600210" observedRunningTime="2026-04-16 17:40:40.81933716 +0000 UTC m=+38.835804682" watchObservedRunningTime="2026-04-16 17:40:40.820558705 +0000 UTC m=+38.837026237"
Apr 16 17:40:40.845619 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:40.845566 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-t42f2" podStartSLOduration=7.463384648 podStartE2EDuration="38.845550544s" podCreationTimestamp="2026-04-16 17:40:02 +0000 UTC" firstStartedPulling="2026-04-16 17:40:05.383257956 +0000 UTC m=+3.399725456" lastFinishedPulling="2026-04-16 17:40:36.76542385 +0000 UTC m=+34.781891352" observedRunningTime="2026-04-16 17:40:40.845443017 +0000 UTC m=+38.861910536" watchObservedRunningTime="2026-04-16 17:40:40.845550544 +0000 UTC m=+38.862018065"
Apr 16 17:40:41.109336 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:41.109252 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-bfc587fb7-8q7zf"
Apr 16 17:40:41.264360 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:41.263436 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-8q7zf"]
Apr 16 17:40:41.267166 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:41.267132 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c4618c0_4652_426d_8d6e_55ebeef5c000.slice/crio-a91353f142f81982a4889290e9a1c0fadb2a4cb9a7fbbf5f0f89c11b44373b9b WatchSource:0}: Error finding container a91353f142f81982a4889290e9a1c0fadb2a4cb9a7fbbf5f0f89c11b44373b9b: Status 404 returned error can't find the container with id a91353f142f81982a4889290e9a1c0fadb2a4cb9a7fbbf5f0f89c11b44373b9b
Apr 16 17:40:41.778425 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:41.778385 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-bfc587fb7-8q7zf" event={"ID":"8c4618c0-4652-426d-8d6e-55ebeef5c000","Type":"ContainerStarted","Data":"a91353f142f81982a4889290e9a1c0fadb2a4cb9a7fbbf5f0f89c11b44373b9b"}
Apr 16 17:40:43.116046 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:43.116001 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3bf148cf-abbf-4345-8487-fd40ceff855c-cert\") pod \"ingress-canary-pc75n\" (UID: \"3bf148cf-abbf-4345-8487-fd40ceff855c\") " pod="openshift-ingress-canary/ingress-canary-pc75n"
Apr 16 17:40:43.116482 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:43.116060 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a3c3daf4-9be0-4443-af47-0a67097365fa-registry-tls\") pod \"image-registry-57577fdccc-fltds\" (UID: \"a3c3daf4-9be0-4443-af47-0a67097365fa\") " pod="openshift-image-registry/image-registry-57577fdccc-fltds"
Apr 16 17:40:43.116482 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:43.116087 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/643af40e-aabc-4fb6-8e21-7926a029b0a0-metrics-tls\") pod \"dns-default-vszkl\" (UID: \"643af40e-aabc-4fb6-8e21-7926a029b0a0\") " pod="openshift-dns/dns-default-vszkl"
Apr 16 17:40:43.116482 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:43.116149 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 17:40:43.116482 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:43.116174 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-57577fdccc-fltds: secret "image-registry-tls" not found
Apr 16 17:40:43.116482 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:43.116215 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 17:40:43.116482 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:43.116236 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a3c3daf4-9be0-4443-af47-0a67097365fa-registry-tls podName:a3c3daf4-9be0-4443-af47-0a67097365fa nodeName:}" failed. No retries permitted until 2026-04-16 17:40:51.116221325 +0000 UTC m=+49.132688824 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a3c3daf4-9be0-4443-af47-0a67097365fa-registry-tls") pod "image-registry-57577fdccc-fltds" (UID: "a3c3daf4-9be0-4443-af47-0a67097365fa") : secret "image-registry-tls" not found
Apr 16 17:40:43.116482 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:43.116148 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 17:40:43.116482 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:43.116265 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/643af40e-aabc-4fb6-8e21-7926a029b0a0-metrics-tls podName:643af40e-aabc-4fb6-8e21-7926a029b0a0 nodeName:}" failed. No retries permitted until 2026-04-16 17:40:51.116250211 +0000 UTC m=+49.132717714 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/643af40e-aabc-4fb6-8e21-7926a029b0a0-metrics-tls") pod "dns-default-vszkl" (UID: "643af40e-aabc-4fb6-8e21-7926a029b0a0") : secret "dns-default-metrics-tls" not found
Apr 16 17:40:43.116482 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:40:43.116282 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3bf148cf-abbf-4345-8487-fd40ceff855c-cert podName:3bf148cf-abbf-4345-8487-fd40ceff855c nodeName:}" failed. No retries permitted until 2026-04-16 17:40:51.116272112 +0000 UTC m=+49.132739611 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3bf148cf-abbf-4345-8487-fd40ceff855c-cert") pod "ingress-canary-pc75n" (UID: "3bf148cf-abbf-4345-8487-fd40ceff855c") : secret "canary-serving-cert" not found
Apr 16 17:40:43.518781 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:43.518751 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/908ddae4-92b4-4772-acb3-f93ed0f019ea-original-pull-secret\") pod \"global-pull-secret-syncer-hxk67\" (UID: \"908ddae4-92b4-4772-acb3-f93ed0f019ea\") " pod="kube-system/global-pull-secret-syncer-hxk67"
Apr 16 17:40:43.521254 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:43.521227 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/908ddae4-92b4-4772-acb3-f93ed0f019ea-original-pull-secret\") pod \"global-pull-secret-syncer-hxk67\" (UID: \"908ddae4-92b4-4772-acb3-f93ed0f019ea\") " pod="kube-system/global-pull-secret-syncer-hxk67"
Apr 16 17:40:43.741062 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:43.740982 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hxk67"
Apr 16 17:40:43.782963 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:43.782932 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-bfc587fb7-8q7zf" event={"ID":"8c4618c0-4652-426d-8d6e-55ebeef5c000","Type":"ContainerStarted","Data":"8dbba3be959644ed68920672da2cf58f3178b9463541313877dc050243d44ace"}
Apr 16 17:40:43.810288 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:43.810238 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-bfc587fb7-8q7zf" podStartSLOduration=1.644036001 podStartE2EDuration="3.810223075s" podCreationTimestamp="2026-04-16 17:40:40 +0000 UTC" firstStartedPulling="2026-04-16 17:40:41.268731658 +0000 UTC m=+39.285199160" lastFinishedPulling="2026-04-16 17:40:43.434918722 +0000 UTC m=+41.451386234" observedRunningTime="2026-04-16 17:40:43.809140811 +0000 UTC m=+41.825608332" watchObservedRunningTime="2026-04-16 17:40:43.810223075 +0000 UTC m=+41.826690625"
Apr 16 17:40:43.860052 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:43.860022 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-hxk67"]
Apr 16 17:40:43.863247 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:43.863212 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod908ddae4_92b4_4772_acb3_f93ed0f019ea.slice/crio-64c130a62b7097cbd724139ed7e211a212070f4ab3794f61cbdd9ae2e8f91520 WatchSource:0}: Error finding container 64c130a62b7097cbd724139ed7e211a212070f4ab3794f61cbdd9ae2e8f91520: Status 404 returned error can't find the container with id 64c130a62b7097cbd724139ed7e211a212070f4ab3794f61cbdd9ae2e8f91520
Apr 16 17:40:44.786288 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:44.786242 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-hxk67" event={"ID":"908ddae4-92b4-4772-acb3-f93ed0f019ea","Type":"ContainerStarted","Data":"64c130a62b7097cbd724139ed7e211a212070f4ab3794f61cbdd9ae2e8f91520"}
Apr 16 17:40:48.795857 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:48.795824 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-hxk67" event={"ID":"908ddae4-92b4-4772-acb3-f93ed0f019ea","Type":"ContainerStarted","Data":"4ee092d5317fe19ead1eae3350cc92d698a6881f383a70c33e1c1e08000d5410"}
Apr 16 17:40:51.187007 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:51.186968 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3bf148cf-abbf-4345-8487-fd40ceff855c-cert\") pod \"ingress-canary-pc75n\" (UID: \"3bf148cf-abbf-4345-8487-fd40ceff855c\") " pod="openshift-ingress-canary/ingress-canary-pc75n"
Apr 16 17:40:51.187413 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:51.187029 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a3c3daf4-9be0-4443-af47-0a67097365fa-registry-tls\") pod \"image-registry-57577fdccc-fltds\" (UID: \"a3c3daf4-9be0-4443-af47-0a67097365fa\") " pod="openshift-image-registry/image-registry-57577fdccc-fltds"
Apr 16 17:40:51.187413 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:51.187058 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/643af40e-aabc-4fb6-8e21-7926a029b0a0-metrics-tls\") pod \"dns-default-vszkl\" (UID: \"643af40e-aabc-4fb6-8e21-7926a029b0a0\") " pod="openshift-dns/dns-default-vszkl"
Apr 16 17:40:51.189550 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:51.189522 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a3c3daf4-9be0-4443-af47-0a67097365fa-registry-tls\") pod \"image-registry-57577fdccc-fltds\" (UID: \"a3c3daf4-9be0-4443-af47-0a67097365fa\") " pod="openshift-image-registry/image-registry-57577fdccc-fltds"
Apr 16 17:40:51.189695 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:51.189522 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/643af40e-aabc-4fb6-8e21-7926a029b0a0-metrics-tls\") pod \"dns-default-vszkl\" (UID: \"643af40e-aabc-4fb6-8e21-7926a029b0a0\") " pod="openshift-dns/dns-default-vszkl"
Apr 16 17:40:51.189695 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:51.189615 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3bf148cf-abbf-4345-8487-fd40ceff855c-cert\") pod \"ingress-canary-pc75n\" (UID: \"3bf148cf-abbf-4345-8487-fd40ceff855c\") " pod="openshift-ingress-canary/ingress-canary-pc75n"
Apr 16 17:40:51.243217 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:51.243178 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-57577fdccc-fltds"
Apr 16 17:40:51.259070 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:51.259039 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-vszkl"
Apr 16 17:40:51.271973 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:51.271948 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-pc75n"
Apr 16 17:40:51.395087 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:51.395037 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-hxk67" podStartSLOduration=19.696813439 podStartE2EDuration="24.395020249s" podCreationTimestamp="2026-04-16 17:40:27 +0000 UTC" firstStartedPulling="2026-04-16 17:40:43.865079751 +0000 UTC m=+41.881547250" lastFinishedPulling="2026-04-16 17:40:48.563286548 +0000 UTC m=+46.579754060" observedRunningTime="2026-04-16 17:40:48.81541043 +0000 UTC m=+46.831877951" watchObservedRunningTime="2026-04-16 17:40:51.395020249 +0000 UTC m=+49.411487767"
Apr 16 17:40:51.395841 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:51.395817 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-57577fdccc-fltds"]
Apr 16 17:40:51.399716 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:51.399690 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3c3daf4_9be0_4443_af47_0a67097365fa.slice/crio-bd17b89a355226014aa89f252f6f5fa8d71b95e01889d9e7bb4c8c9277bd884c WatchSource:0}: Error finding container bd17b89a355226014aa89f252f6f5fa8d71b95e01889d9e7bb4c8c9277bd884c: Status 404 returned error can't find the container with id bd17b89a355226014aa89f252f6f5fa8d71b95e01889d9e7bb4c8c9277bd884c
Apr 16 17:40:51.413645 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:51.413613 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-vszkl"]
Apr 16 17:40:51.416402 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:51.416381 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod643af40e_aabc_4fb6_8e21_7926a029b0a0.slice/crio-ceb5a8ccb19a3b64f0bec05a9cc6b198baf49c1e97851dfeb0db16fc8a4db94f WatchSource:0}: Error finding container ceb5a8ccb19a3b64f0bec05a9cc6b198baf49c1e97851dfeb0db16fc8a4db94f: Status 404 returned error can't find the container with id ceb5a8ccb19a3b64f0bec05a9cc6b198baf49c1e97851dfeb0db16fc8a4db94f
Apr 16 17:40:51.428529 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:51.428475 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-pc75n"]
Apr 16 17:40:51.433668 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:40:51.433640 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3bf148cf_abbf_4345_8487_fd40ceff855c.slice/crio-cc87d6ea77c6a1983dfa406af39eed64bd6b22bed21d39ddb305bcf176981247 WatchSource:0}: Error finding container cc87d6ea77c6a1983dfa406af39eed64bd6b22bed21d39ddb305bcf176981247: Status 404 returned error can't find the container with id cc87d6ea77c6a1983dfa406af39eed64bd6b22bed21d39ddb305bcf176981247
Apr 16 17:40:51.808772 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:51.808686 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-57577fdccc-fltds" event={"ID":"a3c3daf4-9be0-4443-af47-0a67097365fa","Type":"ContainerStarted","Data":"d6d6712d5f748543dc8ef283ca782bbf028b4029821001871bfbb87bb6282772"}
Apr 16 17:40:51.808772 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:51.808732 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-57577fdccc-fltds" event={"ID":"a3c3daf4-9be0-4443-af47-0a67097365fa","Type":"ContainerStarted","Data":"bd17b89a355226014aa89f252f6f5fa8d71b95e01889d9e7bb4c8c9277bd884c"}
Apr 16 17:40:51.808994 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:51.808865 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-57577fdccc-fltds"
Apr 16 17:40:51.810845 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:51.810808 2576 kubelet.go:2569] "SyncLoop (PLEG):
event for pod" pod="openshift-ingress-canary/ingress-canary-pc75n" event={"ID":"3bf148cf-abbf-4345-8487-fd40ceff855c","Type":"ContainerStarted","Data":"cc87d6ea77c6a1983dfa406af39eed64bd6b22bed21d39ddb305bcf176981247"} Apr 16 17:40:51.812677 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:51.812644 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-vszkl" event={"ID":"643af40e-aabc-4fb6-8e21-7926a029b0a0","Type":"ContainerStarted","Data":"ceb5a8ccb19a3b64f0bec05a9cc6b198baf49c1e97851dfeb0db16fc8a4db94f"} Apr 16 17:40:51.838802 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:51.838211 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-57577fdccc-fltds" podStartSLOduration=51.838194601 podStartE2EDuration="51.838194601s" podCreationTimestamp="2026-04-16 17:40:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 17:40:51.837014028 +0000 UTC m=+49.853481552" watchObservedRunningTime="2026-04-16 17:40:51.838194601 +0000 UTC m=+49.854662132" Apr 16 17:40:53.822018 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:53.821984 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-pc75n" event={"ID":"3bf148cf-abbf-4345-8487-fd40ceff855c","Type":"ContainerStarted","Data":"d54339bd30cb73aa384dd0712980ce290bfccf860024ac16fd89dc72f3cd0253"} Apr 16 17:40:53.823633 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:53.823604 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-vszkl" event={"ID":"643af40e-aabc-4fb6-8e21-7926a029b0a0","Type":"ContainerStarted","Data":"f1e1d30ff50f1cdcf2ab52e7c232cddbddb01e45e007b99e77b2dff29a2a081f"} Apr 16 17:40:53.823633 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:53.823637 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-vszkl" 
event={"ID":"643af40e-aabc-4fb6-8e21-7926a029b0a0","Type":"ContainerStarted","Data":"9946f98624a22ef6c7610e1452d94750d131660dafc28a589fc661b87afa0218"} Apr 16 17:40:53.823821 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:53.823753 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-vszkl" Apr 16 17:40:53.843883 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:40:53.843841 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-pc75n" podStartSLOduration=16.823957363 podStartE2EDuration="18.843828863s" podCreationTimestamp="2026-04-16 17:40:35 +0000 UTC" firstStartedPulling="2026-04-16 17:40:51.435424967 +0000 UTC m=+49.451892467" lastFinishedPulling="2026-04-16 17:40:53.455296453 +0000 UTC m=+51.471763967" observedRunningTime="2026-04-16 17:40:53.843421692 +0000 UTC m=+51.859889214" watchObservedRunningTime="2026-04-16 17:40:53.843828863 +0000 UTC m=+51.860296383" Apr 16 17:41:00.750108 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:00.750072 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8kd57" Apr 16 17:41:00.779400 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:00.779264 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-vszkl" podStartSLOduration=23.745326911 podStartE2EDuration="25.779247985s" podCreationTimestamp="2026-04-16 17:40:35 +0000 UTC" firstStartedPulling="2026-04-16 17:40:51.418156527 +0000 UTC m=+49.434624027" lastFinishedPulling="2026-04-16 17:40:53.452077585 +0000 UTC m=+51.468545101" observedRunningTime="2026-04-16 17:40:53.865987618 +0000 UTC m=+51.882455140" watchObservedRunningTime="2026-04-16 17:41:00.779247985 +0000 UTC m=+58.795715506" Apr 16 17:41:03.803950 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:03.803895 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["openshift-image-registry/image-registry-57577fdccc-fltds"] Apr 16 17:41:03.829269 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:03.829240 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-vszkl" Apr 16 17:41:03.839799 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:03.839766 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-587cd5bb74-wz624"] Apr 16 17:41:03.844192 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:03.844173 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-587cd5bb74-wz624" Apr 16 17:41:03.846604 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:03.846573 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-45q8s\"" Apr 16 17:41:03.846853 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:03.846834 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 16 17:41:03.846997 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:03.846897 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 16 17:41:03.847149 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:03.847134 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 16 17:41:03.847210 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:03.847161 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 16 17:41:03.847274 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:03.847256 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 16 17:41:03.847460 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:03.847440 2576 
reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 16 17:41:03.860639 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:03.860614 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-587cd5bb74-wz624"] Apr 16 17:41:03.924419 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:03.924390 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6667474d89-gvbhc"] Apr 16 17:41:03.927234 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:03.927207 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-gvbhc" Apr 16 17:41:03.928511 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:03.928485 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-586b57c7b4-r58vb"] Apr 16 17:41:03.931161 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:03.931145 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-5t6rd"] Apr 16 17:41:03.931349 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:03.931336 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-586b57c7b4-r58vb" Apr 16 17:41:03.931417 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:03.931380 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 17:41:03.931799 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:03.931784 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 16 17:41:03.932105 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:03.932088 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 16 17:41:03.933614 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:03.933590 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 16 17:41:03.933704 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:03.933693 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-nmkf6\"" Apr 16 17:41:03.934181 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:03.934163 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-5785d4fcdd-5t6rd" Apr 16 17:41:03.934264 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:03.934230 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 16 17:41:03.934264 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:03.934237 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-wn95r\"" Apr 16 17:41:03.934368 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:03.934231 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 16 17:41:03.937550 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:03.937532 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-96578b4d-66dzw"] Apr 16 17:41:03.940122 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:41:03.940048 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Secret: secrets \"operator-dockercfg-dnnbz\" is forbidden: User \"system:node:ip-10-0-141-32.ec2.internal\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-insights\": no relationship found between node 'ip-10-0-141-32.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"openshift-insights\"/\"operator-dockercfg-dnnbz\"" type="*v1.Secret" Apr 16 17:41:03.940211 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:41:03.940119 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.ConfigMap: configmaps \"trusted-ca-bundle\" is forbidden: User \"system:node:ip-10-0-141-32.ec2.internal\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-insights\": no relationship found between node 'ip-10-0-141-32.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" type="*v1.ConfigMap" Apr 16 
17:41:03.940211 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:41:03.940131 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Secret: secrets \"openshift-insights-serving-cert\" is forbidden: User \"system:node:ip-10-0-141-32.ec2.internal\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-insights\": no relationship found between node 'ip-10-0-141-32.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" type="*v1.Secret" Apr 16 17:41:03.940211 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:41:03.940120 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:ip-10-0-141-32.ec2.internal\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-insights\": no relationship found between node 'ip-10-0-141-32.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" type="*v1.ConfigMap" Apr 16 17:41:03.940211 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:41:03.940146 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.ConfigMap: configmaps \"service-ca-bundle\" is forbidden: User \"system:node:ip-10-0-141-32.ec2.internal\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-insights\": no relationship found between node 'ip-10-0-141-32.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" type="*v1.ConfigMap" Apr 16 17:41:03.940211 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:41:03.940171 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:ip-10-0-141-32.ec2.internal\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-insights\": no relationship 
found between node 'ip-10-0-141-32.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" type="*v1.ConfigMap" Apr 16 17:41:03.940370 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:03.940335 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-96578b4d-66dzw" Apr 16 17:41:03.942861 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:03.942828 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6667474d89-gvbhc"] Apr 16 17:41:03.955315 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:03.955275 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-586b57c7b4-r58vb"] Apr 16 17:41:03.956022 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:03.955997 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-5t6rd"] Apr 16 17:41:03.968117 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:03.968092 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-96578b4d-66dzw"] Apr 16 17:41:03.984215 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:03.984187 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k55pr\" (UniqueName: \"kubernetes.io/projected/1b5a2170-b541-4c6d-918a-c836b3286e61-kube-api-access-k55pr\") pod \"cluster-monitoring-operator-6667474d89-gvbhc\" (UID: \"1b5a2170-b541-4c6d-918a-c836b3286e61\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-gvbhc" Apr 16 17:41:03.984358 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:03.984221 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d2bce270-4a99-4e0a-a841-56b07aa9d0cb-service-ca-bundle\") pod 
\"insights-operator-5785d4fcdd-5t6rd\" (UID: \"d2bce270-4a99-4e0a-a841-56b07aa9d0cb\") " pod="openshift-insights/insights-operator-5785d4fcdd-5t6rd" Apr 16 17:41:03.984358 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:03.984269 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d2bce270-4a99-4e0a-a841-56b07aa9d0cb-serving-cert\") pod \"insights-operator-5785d4fcdd-5t6rd\" (UID: \"d2bce270-4a99-4e0a-a841-56b07aa9d0cb\") " pod="openshift-insights/insights-operator-5785d4fcdd-5t6rd" Apr 16 17:41:03.984358 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:03.984290 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e9d84c5e-5875-4d83-9674-ab0c1accad47-registry-tls\") pod \"image-registry-96578b4d-66dzw\" (UID: \"e9d84c5e-5875-4d83-9674-ab0c1accad47\") " pod="openshift-image-registry/image-registry-96578b4d-66dzw" Apr 16 17:41:03.984358 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:03.984307 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qm5v\" (UniqueName: \"kubernetes.io/projected/e9d84c5e-5875-4d83-9674-ab0c1accad47-kube-api-access-4qm5v\") pod \"image-registry-96578b4d-66dzw\" (UID: \"e9d84c5e-5875-4d83-9674-ab0c1accad47\") " pod="openshift-image-registry/image-registry-96578b4d-66dzw" Apr 16 17:41:03.984528 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:03.984392 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/d2bce270-4a99-4e0a-a841-56b07aa9d0cb-snapshots\") pod \"insights-operator-5785d4fcdd-5t6rd\" (UID: \"d2bce270-4a99-4e0a-a841-56b07aa9d0cb\") " pod="openshift-insights/insights-operator-5785d4fcdd-5t6rd" Apr 16 17:41:03.984528 ip-10-0-141-32 kubenswrapper[2576]: I0416 
17:41:03.984422 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e9d84c5e-5875-4d83-9674-ab0c1accad47-trusted-ca\") pod \"image-registry-96578b4d-66dzw\" (UID: \"e9d84c5e-5875-4d83-9674-ab0c1accad47\") " pod="openshift-image-registry/image-registry-96578b4d-66dzw" Apr 16 17:41:03.984528 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:03.984441 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2c0dcd7e-0c96-40a1-a6fd-2c9e2109db1a-service-ca-bundle\") pod \"router-default-587cd5bb74-wz624\" (UID: \"2c0dcd7e-0c96-40a1-a6fd-2c9e2109db1a\") " pod="openshift-ingress/router-default-587cd5bb74-wz624" Apr 16 17:41:03.984528 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:03.984512 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e9d84c5e-5875-4d83-9674-ab0c1accad47-ca-trust-extracted\") pod \"image-registry-96578b4d-66dzw\" (UID: \"e9d84c5e-5875-4d83-9674-ab0c1accad47\") " pod="openshift-image-registry/image-registry-96578b4d-66dzw" Apr 16 17:41:03.984650 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:03.984532 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/2c0dcd7e-0c96-40a1-a6fd-2c9e2109db1a-stats-auth\") pod \"router-default-587cd5bb74-wz624\" (UID: \"2c0dcd7e-0c96-40a1-a6fd-2c9e2109db1a\") " pod="openshift-ingress/router-default-587cd5bb74-wz624" Apr 16 17:41:03.984650 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:03.984552 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d2bce270-4a99-4e0a-a841-56b07aa9d0cb-tmp\") pod 
\"insights-operator-5785d4fcdd-5t6rd\" (UID: \"d2bce270-4a99-4e0a-a841-56b07aa9d0cb\") " pod="openshift-insights/insights-operator-5785d4fcdd-5t6rd" Apr 16 17:41:03.984650 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:03.984573 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e9d84c5e-5875-4d83-9674-ab0c1accad47-bound-sa-token\") pod \"image-registry-96578b4d-66dzw\" (UID: \"e9d84c5e-5875-4d83-9674-ab0c1accad47\") " pod="openshift-image-registry/image-registry-96578b4d-66dzw" Apr 16 17:41:03.984650 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:03.984632 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8b94n\" (UniqueName: \"kubernetes.io/projected/421e5aad-dcec-441f-b232-eff2d7c6af79-kube-api-access-8b94n\") pod \"downloads-586b57c7b4-r58vb\" (UID: \"421e5aad-dcec-441f-b232-eff2d7c6af79\") " pod="openshift-console/downloads-586b57c7b4-r58vb" Apr 16 17:41:03.984767 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:03.984650 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsp5k\" (UniqueName: \"kubernetes.io/projected/d2bce270-4a99-4e0a-a841-56b07aa9d0cb-kube-api-access-tsp5k\") pod \"insights-operator-5785d4fcdd-5t6rd\" (UID: \"d2bce270-4a99-4e0a-a841-56b07aa9d0cb\") " pod="openshift-insights/insights-operator-5785d4fcdd-5t6rd" Apr 16 17:41:03.984767 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:03.984673 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e9d84c5e-5875-4d83-9674-ab0c1accad47-image-registry-private-configuration\") pod \"image-registry-96578b4d-66dzw\" (UID: \"e9d84c5e-5875-4d83-9674-ab0c1accad47\") " pod="openshift-image-registry/image-registry-96578b4d-66dzw" Apr 16 
17:41:03.984767 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:03.984699 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-td96b\" (UniqueName: \"kubernetes.io/projected/2c0dcd7e-0c96-40a1-a6fd-2c9e2109db1a-kube-api-access-td96b\") pod \"router-default-587cd5bb74-wz624\" (UID: \"2c0dcd7e-0c96-40a1-a6fd-2c9e2109db1a\") " pod="openshift-ingress/router-default-587cd5bb74-wz624" Apr 16 17:41:03.984767 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:03.984736 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/1b5a2170-b541-4c6d-918a-c836b3286e61-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-gvbhc\" (UID: \"1b5a2170-b541-4c6d-918a-c836b3286e61\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-gvbhc" Apr 16 17:41:03.984767 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:03.984754 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/2c0dcd7e-0c96-40a1-a6fd-2c9e2109db1a-default-certificate\") pod \"router-default-587cd5bb74-wz624\" (UID: \"2c0dcd7e-0c96-40a1-a6fd-2c9e2109db1a\") " pod="openshift-ingress/router-default-587cd5bb74-wz624" Apr 16 17:41:03.984939 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:03.984772 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e9d84c5e-5875-4d83-9674-ab0c1accad47-installation-pull-secrets\") pod \"image-registry-96578b4d-66dzw\" (UID: \"e9d84c5e-5875-4d83-9674-ab0c1accad47\") " pod="openshift-image-registry/image-registry-96578b4d-66dzw" Apr 16 17:41:03.984939 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:03.984792 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2c0dcd7e-0c96-40a1-a6fd-2c9e2109db1a-metrics-certs\") pod \"router-default-587cd5bb74-wz624\" (UID: \"2c0dcd7e-0c96-40a1-a6fd-2c9e2109db1a\") " pod="openshift-ingress/router-default-587cd5bb74-wz624" Apr 16 17:41:03.984939 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:03.984811 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e9d84c5e-5875-4d83-9674-ab0c1accad47-registry-certificates\") pod \"image-registry-96578b4d-66dzw\" (UID: \"e9d84c5e-5875-4d83-9674-ab0c1accad47\") " pod="openshift-image-registry/image-registry-96578b4d-66dzw" Apr 16 17:41:03.984939 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:03.984827 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/1b5a2170-b541-4c6d-918a-c836b3286e61-telemetry-config\") pod \"cluster-monitoring-operator-6667474d89-gvbhc\" (UID: \"1b5a2170-b541-4c6d-918a-c836b3286e61\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-gvbhc" Apr 16 17:41:03.984939 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:03.984842 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d2bce270-4a99-4e0a-a841-56b07aa9d0cb-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-5t6rd\" (UID: \"d2bce270-4a99-4e0a-a841-56b07aa9d0cb\") " pod="openshift-insights/insights-operator-5785d4fcdd-5t6rd" Apr 16 17:41:04.085942 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:04.085834 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e9d84c5e-5875-4d83-9674-ab0c1accad47-trusted-ca\") pod 
\"image-registry-96578b4d-66dzw\" (UID: \"e9d84c5e-5875-4d83-9674-ab0c1accad47\") " pod="openshift-image-registry/image-registry-96578b4d-66dzw" Apr 16 17:41:04.085942 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:04.085870 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2c0dcd7e-0c96-40a1-a6fd-2c9e2109db1a-service-ca-bundle\") pod \"router-default-587cd5bb74-wz624\" (UID: \"2c0dcd7e-0c96-40a1-a6fd-2c9e2109db1a\") " pod="openshift-ingress/router-default-587cd5bb74-wz624" Apr 16 17:41:04.085942 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:04.085897 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e9d84c5e-5875-4d83-9674-ab0c1accad47-ca-trust-extracted\") pod \"image-registry-96578b4d-66dzw\" (UID: \"e9d84c5e-5875-4d83-9674-ab0c1accad47\") " pod="openshift-image-registry/image-registry-96578b4d-66dzw" Apr 16 17:41:04.085942 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:04.085928 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/2c0dcd7e-0c96-40a1-a6fd-2c9e2109db1a-stats-auth\") pod \"router-default-587cd5bb74-wz624\" (UID: \"2c0dcd7e-0c96-40a1-a6fd-2c9e2109db1a\") " pod="openshift-ingress/router-default-587cd5bb74-wz624" Apr 16 17:41:04.085942 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:04.085943 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d2bce270-4a99-4e0a-a841-56b07aa9d0cb-tmp\") pod \"insights-operator-5785d4fcdd-5t6rd\" (UID: \"d2bce270-4a99-4e0a-a841-56b07aa9d0cb\") " pod="openshift-insights/insights-operator-5785d4fcdd-5t6rd" Apr 16 17:41:04.086315 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:04.085961 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e9d84c5e-5875-4d83-9674-ab0c1accad47-bound-sa-token\") pod \"image-registry-96578b4d-66dzw\" (UID: \"e9d84c5e-5875-4d83-9674-ab0c1accad47\") " pod="openshift-image-registry/image-registry-96578b4d-66dzw" Apr 16 17:41:04.086315 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:04.085980 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8b94n\" (UniqueName: \"kubernetes.io/projected/421e5aad-dcec-441f-b232-eff2d7c6af79-kube-api-access-8b94n\") pod \"downloads-586b57c7b4-r58vb\" (UID: \"421e5aad-dcec-441f-b232-eff2d7c6af79\") " pod="openshift-console/downloads-586b57c7b4-r58vb" Apr 16 17:41:04.086315 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:04.086001 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tsp5k\" (UniqueName: \"kubernetes.io/projected/d2bce270-4a99-4e0a-a841-56b07aa9d0cb-kube-api-access-tsp5k\") pod \"insights-operator-5785d4fcdd-5t6rd\" (UID: \"d2bce270-4a99-4e0a-a841-56b07aa9d0cb\") " pod="openshift-insights/insights-operator-5785d4fcdd-5t6rd" Apr 16 17:41:04.086315 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:04.086027 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e9d84c5e-5875-4d83-9674-ab0c1accad47-image-registry-private-configuration\") pod \"image-registry-96578b4d-66dzw\" (UID: \"e9d84c5e-5875-4d83-9674-ab0c1accad47\") " pod="openshift-image-registry/image-registry-96578b4d-66dzw" Apr 16 17:41:04.086315 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:04.086062 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-td96b\" (UniqueName: \"kubernetes.io/projected/2c0dcd7e-0c96-40a1-a6fd-2c9e2109db1a-kube-api-access-td96b\") pod \"router-default-587cd5bb74-wz624\" (UID: \"2c0dcd7e-0c96-40a1-a6fd-2c9e2109db1a\") " 
pod="openshift-ingress/router-default-587cd5bb74-wz624" Apr 16 17:41:04.086315 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:04.086105 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/1b5a2170-b541-4c6d-918a-c836b3286e61-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-gvbhc\" (UID: \"1b5a2170-b541-4c6d-918a-c836b3286e61\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-gvbhc" Apr 16 17:41:04.086315 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:04.086131 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/2c0dcd7e-0c96-40a1-a6fd-2c9e2109db1a-default-certificate\") pod \"router-default-587cd5bb74-wz624\" (UID: \"2c0dcd7e-0c96-40a1-a6fd-2c9e2109db1a\") " pod="openshift-ingress/router-default-587cd5bb74-wz624" Apr 16 17:41:04.086315 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:04.086157 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e9d84c5e-5875-4d83-9674-ab0c1accad47-installation-pull-secrets\") pod \"image-registry-96578b4d-66dzw\" (UID: \"e9d84c5e-5875-4d83-9674-ab0c1accad47\") " pod="openshift-image-registry/image-registry-96578b4d-66dzw" Apr 16 17:41:04.086315 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:04.086185 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2c0dcd7e-0c96-40a1-a6fd-2c9e2109db1a-metrics-certs\") pod \"router-default-587cd5bb74-wz624\" (UID: \"2c0dcd7e-0c96-40a1-a6fd-2c9e2109db1a\") " pod="openshift-ingress/router-default-587cd5bb74-wz624" Apr 16 17:41:04.086315 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:04.086209 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e9d84c5e-5875-4d83-9674-ab0c1accad47-registry-certificates\") pod \"image-registry-96578b4d-66dzw\" (UID: \"e9d84c5e-5875-4d83-9674-ab0c1accad47\") " pod="openshift-image-registry/image-registry-96578b4d-66dzw" Apr 16 17:41:04.086315 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:04.086234 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/1b5a2170-b541-4c6d-918a-c836b3286e61-telemetry-config\") pod \"cluster-monitoring-operator-6667474d89-gvbhc\" (UID: \"1b5a2170-b541-4c6d-918a-c836b3286e61\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-gvbhc" Apr 16 17:41:04.086315 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:04.086260 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d2bce270-4a99-4e0a-a841-56b07aa9d0cb-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-5t6rd\" (UID: \"d2bce270-4a99-4e0a-a841-56b07aa9d0cb\") " pod="openshift-insights/insights-operator-5785d4fcdd-5t6rd" Apr 16 17:41:04.086315 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:04.086293 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k55pr\" (UniqueName: \"kubernetes.io/projected/1b5a2170-b541-4c6d-918a-c836b3286e61-kube-api-access-k55pr\") pod \"cluster-monitoring-operator-6667474d89-gvbhc\" (UID: \"1b5a2170-b541-4c6d-918a-c836b3286e61\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-gvbhc" Apr 16 17:41:04.086315 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:04.086319 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d2bce270-4a99-4e0a-a841-56b07aa9d0cb-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-5t6rd\" (UID: \"d2bce270-4a99-4e0a-a841-56b07aa9d0cb\") 
" pod="openshift-insights/insights-operator-5785d4fcdd-5t6rd" Apr 16 17:41:04.087031 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:04.086345 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d2bce270-4a99-4e0a-a841-56b07aa9d0cb-serving-cert\") pod \"insights-operator-5785d4fcdd-5t6rd\" (UID: \"d2bce270-4a99-4e0a-a841-56b07aa9d0cb\") " pod="openshift-insights/insights-operator-5785d4fcdd-5t6rd" Apr 16 17:41:04.087031 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:04.086369 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e9d84c5e-5875-4d83-9674-ab0c1accad47-registry-tls\") pod \"image-registry-96578b4d-66dzw\" (UID: \"e9d84c5e-5875-4d83-9674-ab0c1accad47\") " pod="openshift-image-registry/image-registry-96578b4d-66dzw" Apr 16 17:41:04.087031 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:04.086394 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4qm5v\" (UniqueName: \"kubernetes.io/projected/e9d84c5e-5875-4d83-9674-ab0c1accad47-kube-api-access-4qm5v\") pod \"image-registry-96578b4d-66dzw\" (UID: \"e9d84c5e-5875-4d83-9674-ab0c1accad47\") " pod="openshift-image-registry/image-registry-96578b4d-66dzw" Apr 16 17:41:04.087031 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:04.086435 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/d2bce270-4a99-4e0a-a841-56b07aa9d0cb-snapshots\") pod \"insights-operator-5785d4fcdd-5t6rd\" (UID: \"d2bce270-4a99-4e0a-a841-56b07aa9d0cb\") " pod="openshift-insights/insights-operator-5785d4fcdd-5t6rd" Apr 16 17:41:04.087031 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:04.086567 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/d2bce270-4a99-4e0a-a841-56b07aa9d0cb-tmp\") pod \"insights-operator-5785d4fcdd-5t6rd\" (UID: \"d2bce270-4a99-4e0a-a841-56b07aa9d0cb\") " pod="openshift-insights/insights-operator-5785d4fcdd-5t6rd" Apr 16 17:41:04.087031 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:04.086775 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2c0dcd7e-0c96-40a1-a6fd-2c9e2109db1a-service-ca-bundle\") pod \"router-default-587cd5bb74-wz624\" (UID: \"2c0dcd7e-0c96-40a1-a6fd-2c9e2109db1a\") " pod="openshift-ingress/router-default-587cd5bb74-wz624" Apr 16 17:41:04.087327 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:04.087032 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e9d84c5e-5875-4d83-9674-ab0c1accad47-trusted-ca\") pod \"image-registry-96578b4d-66dzw\" (UID: \"e9d84c5e-5875-4d83-9674-ab0c1accad47\") " pod="openshift-image-registry/image-registry-96578b4d-66dzw" Apr 16 17:41:04.087327 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:04.087070 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/d2bce270-4a99-4e0a-a841-56b07aa9d0cb-snapshots\") pod \"insights-operator-5785d4fcdd-5t6rd\" (UID: \"d2bce270-4a99-4e0a-a841-56b07aa9d0cb\") " pod="openshift-insights/insights-operator-5785d4fcdd-5t6rd" Apr 16 17:41:04.087327 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:04.087244 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e9d84c5e-5875-4d83-9674-ab0c1accad47-ca-trust-extracted\") pod \"image-registry-96578b4d-66dzw\" (UID: \"e9d84c5e-5875-4d83-9674-ab0c1accad47\") " pod="openshift-image-registry/image-registry-96578b4d-66dzw" Apr 16 17:41:04.088367 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:04.088336 2576 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/1b5a2170-b541-4c6d-918a-c836b3286e61-telemetry-config\") pod \"cluster-monitoring-operator-6667474d89-gvbhc\" (UID: \"1b5a2170-b541-4c6d-918a-c836b3286e61\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-gvbhc" Apr 16 17:41:04.089490 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:04.089439 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2c0dcd7e-0c96-40a1-a6fd-2c9e2109db1a-metrics-certs\") pod \"router-default-587cd5bb74-wz624\" (UID: \"2c0dcd7e-0c96-40a1-a6fd-2c9e2109db1a\") " pod="openshift-ingress/router-default-587cd5bb74-wz624" Apr 16 17:41:04.089490 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:04.089477 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/2c0dcd7e-0c96-40a1-a6fd-2c9e2109db1a-stats-auth\") pod \"router-default-587cd5bb74-wz624\" (UID: \"2c0dcd7e-0c96-40a1-a6fd-2c9e2109db1a\") " pod="openshift-ingress/router-default-587cd5bb74-wz624" Apr 16 17:41:04.089677 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:04.089615 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e9d84c5e-5875-4d83-9674-ab0c1accad47-registry-certificates\") pod \"image-registry-96578b4d-66dzw\" (UID: \"e9d84c5e-5875-4d83-9674-ab0c1accad47\") " pod="openshift-image-registry/image-registry-96578b4d-66dzw" Apr 16 17:41:04.089732 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:04.089681 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/1b5a2170-b541-4c6d-918a-c836b3286e61-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-gvbhc\" (UID: \"1b5a2170-b541-4c6d-918a-c836b3286e61\") " 
pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-gvbhc" Apr 16 17:41:04.089842 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:04.089812 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e9d84c5e-5875-4d83-9674-ab0c1accad47-image-registry-private-configuration\") pod \"image-registry-96578b4d-66dzw\" (UID: \"e9d84c5e-5875-4d83-9674-ab0c1accad47\") " pod="openshift-image-registry/image-registry-96578b4d-66dzw" Apr 16 17:41:04.090238 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:04.090218 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e9d84c5e-5875-4d83-9674-ab0c1accad47-installation-pull-secrets\") pod \"image-registry-96578b4d-66dzw\" (UID: \"e9d84c5e-5875-4d83-9674-ab0c1accad47\") " pod="openshift-image-registry/image-registry-96578b4d-66dzw" Apr 16 17:41:04.090238 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:04.090233 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e9d84c5e-5875-4d83-9674-ab0c1accad47-registry-tls\") pod \"image-registry-96578b4d-66dzw\" (UID: \"e9d84c5e-5875-4d83-9674-ab0c1accad47\") " pod="openshift-image-registry/image-registry-96578b4d-66dzw" Apr 16 17:41:04.090651 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:04.090632 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/2c0dcd7e-0c96-40a1-a6fd-2c9e2109db1a-default-certificate\") pod \"router-default-587cd5bb74-wz624\" (UID: \"2c0dcd7e-0c96-40a1-a6fd-2c9e2109db1a\") " pod="openshift-ingress/router-default-587cd5bb74-wz624" Apr 16 17:41:04.096627 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:04.096606 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qm5v\" (UniqueName: 
\"kubernetes.io/projected/e9d84c5e-5875-4d83-9674-ab0c1accad47-kube-api-access-4qm5v\") pod \"image-registry-96578b4d-66dzw\" (UID: \"e9d84c5e-5875-4d83-9674-ab0c1accad47\") " pod="openshift-image-registry/image-registry-96578b4d-66dzw" Apr 16 17:41:04.096839 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:04.096794 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-td96b\" (UniqueName: \"kubernetes.io/projected/2c0dcd7e-0c96-40a1-a6fd-2c9e2109db1a-kube-api-access-td96b\") pod \"router-default-587cd5bb74-wz624\" (UID: \"2c0dcd7e-0c96-40a1-a6fd-2c9e2109db1a\") " pod="openshift-ingress/router-default-587cd5bb74-wz624" Apr 16 17:41:04.097000 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:04.096981 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k55pr\" (UniqueName: \"kubernetes.io/projected/1b5a2170-b541-4c6d-918a-c836b3286e61-kube-api-access-k55pr\") pod \"cluster-monitoring-operator-6667474d89-gvbhc\" (UID: \"1b5a2170-b541-4c6d-918a-c836b3286e61\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-gvbhc" Apr 16 17:41:04.097721 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:04.097700 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8b94n\" (UniqueName: \"kubernetes.io/projected/421e5aad-dcec-441f-b232-eff2d7c6af79-kube-api-access-8b94n\") pod \"downloads-586b57c7b4-r58vb\" (UID: \"421e5aad-dcec-441f-b232-eff2d7c6af79\") " pod="openshift-console/downloads-586b57c7b4-r58vb" Apr 16 17:41:04.098875 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:04.098847 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e9d84c5e-5875-4d83-9674-ab0c1accad47-bound-sa-token\") pod \"image-registry-96578b4d-66dzw\" (UID: \"e9d84c5e-5875-4d83-9674-ab0c1accad47\") " pod="openshift-image-registry/image-registry-96578b4d-66dzw" Apr 16 17:41:04.152896 ip-10-0-141-32 
kubenswrapper[2576]: I0416 17:41:04.152857 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-587cd5bb74-wz624" Apr 16 17:41:04.238461 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:04.238433 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-gvbhc" Apr 16 17:41:04.244716 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:04.244664 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-586b57c7b4-r58vb" Apr 16 17:41:04.261744 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:04.261621 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-96578b4d-66dzw" Apr 16 17:41:04.285478 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:04.285443 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-587cd5bb74-wz624"] Apr 16 17:41:04.289573 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:41:04.289519 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c0dcd7e_0c96_40a1_a6fd_2c9e2109db1a.slice/crio-b62cf897eb581c3b561ccfcaa54cfe560bb06472172013886d344b3c727bdb0a WatchSource:0}: Error finding container b62cf897eb581c3b561ccfcaa54cfe560bb06472172013886d344b3c727bdb0a: Status 404 returned error can't find the container with id b62cf897eb581c3b561ccfcaa54cfe560bb06472172013886d344b3c727bdb0a Apr 16 17:41:04.392319 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:04.392252 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6667474d89-gvbhc"] Apr 16 17:41:04.396383 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:41:04.396342 2576 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b5a2170_b541_4c6d_918a_c836b3286e61.slice/crio-831d13fe5f47d574563338bbbcd2f996469f57c6fcf6882915e9e6237438213a WatchSource:0}: Error finding container 831d13fe5f47d574563338bbbcd2f996469f57c6fcf6882915e9e6237438213a: Status 404 returned error can't find the container with id 831d13fe5f47d574563338bbbcd2f996469f57c6fcf6882915e9e6237438213a Apr 16 17:41:04.410874 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:04.410814 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-586b57c7b4-r58vb"] Apr 16 17:41:04.415120 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:41:04.415088 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod421e5aad_dcec_441f_b232_eff2d7c6af79.slice/crio-57ad91d46d3b4180a07a5e14c813e3c88faf2c2ec34ab907740358239ddf66cf WatchSource:0}: Error finding container 57ad91d46d3b4180a07a5e14c813e3c88faf2c2ec34ab907740358239ddf66cf: Status 404 returned error can't find the container with id 57ad91d46d3b4180a07a5e14c813e3c88faf2c2ec34ab907740358239ddf66cf Apr 16 17:41:04.434738 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:04.434629 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-96578b4d-66dzw"] Apr 16 17:41:04.437125 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:41:04.437099 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9d84c5e_5875_4d83_9674_ab0c1accad47.slice/crio-e0dc0c7212e06e16c667e1ab5589a95de1dd300ed1f5908431ff5f43d2cc9688 WatchSource:0}: Error finding container e0dc0c7212e06e16c667e1ab5589a95de1dd300ed1f5908431ff5f43d2cc9688: Status 404 returned error can't find the container with id e0dc0c7212e06e16c667e1ab5589a95de1dd300ed1f5908431ff5f43d2cc9688 Apr 16 17:41:04.786428 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:04.786396 2576 
reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 16 17:41:04.789035 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:04.789013 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d2bce270-4a99-4e0a-a841-56b07aa9d0cb-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-5t6rd\" (UID: \"d2bce270-4a99-4e0a-a841-56b07aa9d0cb\") " pod="openshift-insights/insights-operator-5785d4fcdd-5t6rd" Apr 16 17:41:04.852919 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:04.852871 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-96578b4d-66dzw" event={"ID":"e9d84c5e-5875-4d83-9674-ab0c1accad47","Type":"ContainerStarted","Data":"aa5f5ec3cacd6b14b3edf2b363c0c086aae345adebb997ed748ccb9d0abfe1a1"} Apr 16 17:41:04.853336 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:04.852928 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-96578b4d-66dzw" event={"ID":"e9d84c5e-5875-4d83-9674-ab0c1accad47","Type":"ContainerStarted","Data":"e0dc0c7212e06e16c667e1ab5589a95de1dd300ed1f5908431ff5f43d2cc9688"} Apr 16 17:41:04.853336 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:04.853088 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-96578b4d-66dzw" Apr 16 17:41:04.854060 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:04.854039 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-gvbhc" event={"ID":"1b5a2170-b541-4c6d-918a-c836b3286e61","Type":"ContainerStarted","Data":"831d13fe5f47d574563338bbbcd2f996469f57c6fcf6882915e9e6237438213a"} Apr 16 17:41:04.855022 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:04.854997 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-console/downloads-586b57c7b4-r58vb" event={"ID":"421e5aad-dcec-441f-b232-eff2d7c6af79","Type":"ContainerStarted","Data":"57ad91d46d3b4180a07a5e14c813e3c88faf2c2ec34ab907740358239ddf66cf"} Apr 16 17:41:04.856164 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:04.856140 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-587cd5bb74-wz624" event={"ID":"2c0dcd7e-0c96-40a1-a6fd-2c9e2109db1a","Type":"ContainerStarted","Data":"be3903272d975efb8310e100f666b8c81768ebc6b7da5e7003d60c0395aa140b"} Apr 16 17:41:04.856260 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:04.856169 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-587cd5bb74-wz624" event={"ID":"2c0dcd7e-0c96-40a1-a6fd-2c9e2109db1a","Type":"ContainerStarted","Data":"b62cf897eb581c3b561ccfcaa54cfe560bb06472172013886d344b3c727bdb0a"} Apr 16 17:41:04.876224 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:04.876113 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-96578b4d-66dzw" podStartSLOduration=1.8760968139999998 podStartE2EDuration="1.876096814s" podCreationTimestamp="2026-04-16 17:41:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 17:41:04.875493434 +0000 UTC m=+62.891960956" watchObservedRunningTime="2026-04-16 17:41:04.876096814 +0000 UTC m=+62.892564335" Apr 16 17:41:04.898676 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:04.898615 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-587cd5bb74-wz624" podStartSLOduration=1.898594941 podStartE2EDuration="1.898594941s" podCreationTimestamp="2026-04-16 17:41:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 17:41:04.897360951 +0000 UTC 
m=+62.913828473" watchObservedRunningTime="2026-04-16 17:41:04.898594941 +0000 UTC m=+62.915062526" Apr 16 17:41:04.957292 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:04.957254 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-546f885c99-8d5wc"] Apr 16 17:41:04.960502 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:04.960470 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-546f885c99-8d5wc" Apr 16 17:41:04.964189 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:04.963716 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 16 17:41:04.964189 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:04.963930 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 16 17:41:04.964189 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:04.963944 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-lwftg\"" Apr 16 17:41:04.964189 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:04.963957 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 16 17:41:04.964527 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:04.964506 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 16 17:41:04.964632 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:04.964585 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 16 17:41:04.972588 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:04.972565 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-546f885c99-8d5wc"] Apr 16 17:41:04.999144 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:04.999096 2576 
reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-dnnbz\"" Apr 16 17:41:05.087568 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:41:05.087538 2576 configmap.go:193] Couldn't get configMap openshift-insights/service-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Apr 16 17:41:05.087760 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:41:05.087538 2576 secret.go:189] Couldn't get secret openshift-insights/openshift-insights-serving-cert: failed to sync secret cache: timed out waiting for the condition Apr 16 17:41:05.087760 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:41:05.087655 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d2bce270-4a99-4e0a-a841-56b07aa9d0cb-service-ca-bundle podName:d2bce270-4a99-4e0a-a841-56b07aa9d0cb nodeName:}" failed. No retries permitted until 2026-04-16 17:41:05.587634124 +0000 UTC m=+63.604101639 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/d2bce270-4a99-4e0a-a841-56b07aa9d0cb-service-ca-bundle") pod "insights-operator-5785d4fcdd-5t6rd" (UID: "d2bce270-4a99-4e0a-a841-56b07aa9d0cb") : failed to sync configmap cache: timed out waiting for the condition Apr 16 17:41:05.087760 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:41:05.087718 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d2bce270-4a99-4e0a-a841-56b07aa9d0cb-serving-cert podName:d2bce270-4a99-4e0a-a841-56b07aa9d0cb nodeName:}" failed. No retries permitted until 2026-04-16 17:41:05.587698702 +0000 UTC m=+63.604166201 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/d2bce270-4a99-4e0a-a841-56b07aa9d0cb-serving-cert") pod "insights-operator-5785d4fcdd-5t6rd" (UID: "d2bce270-4a99-4e0a-a841-56b07aa9d0cb") : failed to sync secret cache: timed out waiting for the condition Apr 16 17:41:05.094487 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:05.094447 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/67580a9a-b1fe-4921-9484-c90f744ccc62-console-config\") pod \"console-546f885c99-8d5wc\" (UID: \"67580a9a-b1fe-4921-9484-c90f744ccc62\") " pod="openshift-console/console-546f885c99-8d5wc" Apr 16 17:41:05.094643 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:05.094569 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/67580a9a-b1fe-4921-9484-c90f744ccc62-oauth-serving-cert\") pod \"console-546f885c99-8d5wc\" (UID: \"67580a9a-b1fe-4921-9484-c90f744ccc62\") " pod="openshift-console/console-546f885c99-8d5wc" Apr 16 17:41:05.094643 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:05.094622 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/67580a9a-b1fe-4921-9484-c90f744ccc62-console-oauth-config\") pod \"console-546f885c99-8d5wc\" (UID: \"67580a9a-b1fe-4921-9484-c90f744ccc62\") " pod="openshift-console/console-546f885c99-8d5wc" Apr 16 17:41:05.094743 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:05.094686 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/67580a9a-b1fe-4921-9484-c90f744ccc62-console-serving-cert\") pod \"console-546f885c99-8d5wc\" (UID: \"67580a9a-b1fe-4921-9484-c90f744ccc62\") " 
pod="openshift-console/console-546f885c99-8d5wc" Apr 16 17:41:05.094790 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:41:05.094743 2576 projected.go:289] Couldn't get configMap openshift-insights/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Apr 16 17:41:05.094790 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:05.094752 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrfmt\" (UniqueName: \"kubernetes.io/projected/67580a9a-b1fe-4921-9484-c90f744ccc62-kube-api-access-lrfmt\") pod \"console-546f885c99-8d5wc\" (UID: \"67580a9a-b1fe-4921-9484-c90f744ccc62\") " pod="openshift-console/console-546f885c99-8d5wc" Apr 16 17:41:05.094894 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:05.094822 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/67580a9a-b1fe-4921-9484-c90f744ccc62-service-ca\") pod \"console-546f885c99-8d5wc\" (UID: \"67580a9a-b1fe-4921-9484-c90f744ccc62\") " pod="openshift-console/console-546f885c99-8d5wc" Apr 16 17:41:05.150076 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:05.150042 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 16 17:41:05.153636 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:05.153607 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-587cd5bb74-wz624" Apr 16 17:41:05.156498 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:05.156475 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-587cd5bb74-wz624" Apr 16 17:41:05.196423 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:05.196112 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/67580a9a-b1fe-4921-9484-c90f744ccc62-console-config\") pod \"console-546f885c99-8d5wc\" (UID: \"67580a9a-b1fe-4921-9484-c90f744ccc62\") " pod="openshift-console/console-546f885c99-8d5wc" Apr 16 17:41:05.196423 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:05.196206 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/67580a9a-b1fe-4921-9484-c90f744ccc62-oauth-serving-cert\") pod \"console-546f885c99-8d5wc\" (UID: \"67580a9a-b1fe-4921-9484-c90f744ccc62\") " pod="openshift-console/console-546f885c99-8d5wc" Apr 16 17:41:05.196423 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:05.196250 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/67580a9a-b1fe-4921-9484-c90f744ccc62-console-oauth-config\") pod \"console-546f885c99-8d5wc\" (UID: \"67580a9a-b1fe-4921-9484-c90f744ccc62\") " pod="openshift-console/console-546f885c99-8d5wc" Apr 16 17:41:05.196423 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:05.196314 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/67580a9a-b1fe-4921-9484-c90f744ccc62-console-serving-cert\") pod \"console-546f885c99-8d5wc\" (UID: \"67580a9a-b1fe-4921-9484-c90f744ccc62\") " pod="openshift-console/console-546f885c99-8d5wc" Apr 16 17:41:05.196423 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:05.196355 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lrfmt\" (UniqueName: \"kubernetes.io/projected/67580a9a-b1fe-4921-9484-c90f744ccc62-kube-api-access-lrfmt\") pod \"console-546f885c99-8d5wc\" (UID: \"67580a9a-b1fe-4921-9484-c90f744ccc62\") " pod="openshift-console/console-546f885c99-8d5wc" Apr 16 17:41:05.196423 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:05.196389 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/67580a9a-b1fe-4921-9484-c90f744ccc62-service-ca\") pod \"console-546f885c99-8d5wc\" (UID: \"67580a9a-b1fe-4921-9484-c90f744ccc62\") " pod="openshift-console/console-546f885c99-8d5wc" Apr 16 17:41:05.197073 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:05.197016 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/67580a9a-b1fe-4921-9484-c90f744ccc62-console-config\") pod \"console-546f885c99-8d5wc\" (UID: \"67580a9a-b1fe-4921-9484-c90f744ccc62\") " pod="openshift-console/console-546f885c99-8d5wc" Apr 16 17:41:05.197073 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:05.197018 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/67580a9a-b1fe-4921-9484-c90f744ccc62-oauth-serving-cert\") pod \"console-546f885c99-8d5wc\" (UID: \"67580a9a-b1fe-4921-9484-c90f744ccc62\") " pod="openshift-console/console-546f885c99-8d5wc" Apr 16 17:41:05.197547 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:05.197521 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/67580a9a-b1fe-4921-9484-c90f744ccc62-service-ca\") pod \"console-546f885c99-8d5wc\" (UID: \"67580a9a-b1fe-4921-9484-c90f744ccc62\") " pod="openshift-console/console-546f885c99-8d5wc" Apr 16 17:41:05.200331 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:05.200296 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/67580a9a-b1fe-4921-9484-c90f744ccc62-console-serving-cert\") pod \"console-546f885c99-8d5wc\" (UID: \"67580a9a-b1fe-4921-9484-c90f744ccc62\") " pod="openshift-console/console-546f885c99-8d5wc" Apr 16 17:41:05.200946 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:05.200881 2576 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/67580a9a-b1fe-4921-9484-c90f744ccc62-console-oauth-config\") pod \"console-546f885c99-8d5wc\" (UID: \"67580a9a-b1fe-4921-9484-c90f744ccc62\") " pod="openshift-console/console-546f885c99-8d5wc" Apr 16 17:41:05.206835 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:05.206815 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrfmt\" (UniqueName: \"kubernetes.io/projected/67580a9a-b1fe-4921-9484-c90f744ccc62-kube-api-access-lrfmt\") pod \"console-546f885c99-8d5wc\" (UID: \"67580a9a-b1fe-4921-9484-c90f744ccc62\") " pod="openshift-console/console-546f885c99-8d5wc" Apr 16 17:41:05.264141 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:05.264109 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 16 17:41:05.265146 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:41:05.265117 2576 projected.go:194] Error preparing data for projected volume kube-api-access-tsp5k for pod openshift-insights/insights-operator-5785d4fcdd-5t6rd: failed to sync configmap cache: timed out waiting for the condition Apr 16 17:41:05.265249 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:41:05.265204 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d2bce270-4a99-4e0a-a841-56b07aa9d0cb-kube-api-access-tsp5k podName:d2bce270-4a99-4e0a-a841-56b07aa9d0cb nodeName:}" failed. No retries permitted until 2026-04-16 17:41:05.765181857 +0000 UTC m=+63.781649374 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-tsp5k" (UniqueName: "kubernetes.io/projected/d2bce270-4a99-4e0a-a841-56b07aa9d0cb-kube-api-access-tsp5k") pod "insights-operator-5785d4fcdd-5t6rd" (UID: "d2bce270-4a99-4e0a-a841-56b07aa9d0cb") : failed to sync configmap cache: timed out waiting for the condition Apr 16 17:41:05.272748 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:05.272717 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-546f885c99-8d5wc" Apr 16 17:41:05.304120 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:05.303932 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 16 17:41:05.339172 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:05.338884 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 16 17:41:05.425118 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:05.425020 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-546f885c99-8d5wc"] Apr 16 17:41:05.428709 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:41:05.428668 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67580a9a_b1fe_4921_9484_c90f744ccc62.slice/crio-3037168672db451ad5b1fc4c86bd3eca8de1f8b543ea650077774eed196b53bf WatchSource:0}: Error finding container 3037168672db451ad5b1fc4c86bd3eca8de1f8b543ea650077774eed196b53bf: Status 404 returned error can't find the container with id 3037168672db451ad5b1fc4c86bd3eca8de1f8b543ea650077774eed196b53bf Apr 16 17:41:05.600798 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:05.600755 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d2bce270-4a99-4e0a-a841-56b07aa9d0cb-service-ca-bundle\") pod 
\"insights-operator-5785d4fcdd-5t6rd\" (UID: \"d2bce270-4a99-4e0a-a841-56b07aa9d0cb\") " pod="openshift-insights/insights-operator-5785d4fcdd-5t6rd" Apr 16 17:41:05.600988 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:05.600807 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d2bce270-4a99-4e0a-a841-56b07aa9d0cb-serving-cert\") pod \"insights-operator-5785d4fcdd-5t6rd\" (UID: \"d2bce270-4a99-4e0a-a841-56b07aa9d0cb\") " pod="openshift-insights/insights-operator-5785d4fcdd-5t6rd" Apr 16 17:41:05.601605 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:05.601578 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d2bce270-4a99-4e0a-a841-56b07aa9d0cb-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-5t6rd\" (UID: \"d2bce270-4a99-4e0a-a841-56b07aa9d0cb\") " pod="openshift-insights/insights-operator-5785d4fcdd-5t6rd" Apr 16 17:41:05.604213 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:05.604146 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d2bce270-4a99-4e0a-a841-56b07aa9d0cb-serving-cert\") pod \"insights-operator-5785d4fcdd-5t6rd\" (UID: \"d2bce270-4a99-4e0a-a841-56b07aa9d0cb\") " pod="openshift-insights/insights-operator-5785d4fcdd-5t6rd" Apr 16 17:41:05.802572 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:05.802243 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tsp5k\" (UniqueName: \"kubernetes.io/projected/d2bce270-4a99-4e0a-a841-56b07aa9d0cb-kube-api-access-tsp5k\") pod \"insights-operator-5785d4fcdd-5t6rd\" (UID: \"d2bce270-4a99-4e0a-a841-56b07aa9d0cb\") " pod="openshift-insights/insights-operator-5785d4fcdd-5t6rd" Apr 16 17:41:05.805534 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:05.805466 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-tsp5k\" (UniqueName: \"kubernetes.io/projected/d2bce270-4a99-4e0a-a841-56b07aa9d0cb-kube-api-access-tsp5k\") pod \"insights-operator-5785d4fcdd-5t6rd\" (UID: \"d2bce270-4a99-4e0a-a841-56b07aa9d0cb\") " pod="openshift-insights/insights-operator-5785d4fcdd-5t6rd" Apr 16 17:41:05.861682 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:05.860878 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-546f885c99-8d5wc" event={"ID":"67580a9a-b1fe-4921-9484-c90f744ccc62","Type":"ContainerStarted","Data":"3037168672db451ad5b1fc4c86bd3eca8de1f8b543ea650077774eed196b53bf"} Apr 16 17:41:05.861682 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:05.861498 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-587cd5bb74-wz624" Apr 16 17:41:05.862945 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:05.862922 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-587cd5bb74-wz624" Apr 16 17:41:06.052175 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:06.052136 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-5785d4fcdd-5t6rd" Apr 16 17:41:06.496784 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:06.496626 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-5t6rd"] Apr 16 17:41:06.499527 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:41:06.499495 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd2bce270_4a99_4e0a_a841_56b07aa9d0cb.slice/crio-9e8449d6f4ace69123b7123653d3d24a7eba8dce7e10ad0dae76d9cc47be9ba6 WatchSource:0}: Error finding container 9e8449d6f4ace69123b7123653d3d24a7eba8dce7e10ad0dae76d9cc47be9ba6: Status 404 returned error can't find the container with id 9e8449d6f4ace69123b7123653d3d24a7eba8dce7e10ad0dae76d9cc47be9ba6 Apr 16 17:41:06.865439 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:06.865393 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-5t6rd" event={"ID":"d2bce270-4a99-4e0a-a841-56b07aa9d0cb","Type":"ContainerStarted","Data":"9e8449d6f4ace69123b7123653d3d24a7eba8dce7e10ad0dae76d9cc47be9ba6"} Apr 16 17:41:06.868085 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:06.867188 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-gvbhc" event={"ID":"1b5a2170-b541-4c6d-918a-c836b3286e61","Type":"ContainerStarted","Data":"e8d4cb2129d9e53e092c73a307468e25be52286ffaed02cc6b301276c25ea844"} Apr 16 17:41:07.332477 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:07.332416 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-gvbhc" podStartSLOduration=2.023728359 podStartE2EDuration="4.332392161s" podCreationTimestamp="2026-04-16 17:41:03 +0000 UTC" firstStartedPulling="2026-04-16 17:41:04.398414974 +0000 UTC m=+62.414882477" 
lastFinishedPulling="2026-04-16 17:41:06.70707878 +0000 UTC m=+64.723546279" observedRunningTime="2026-04-16 17:41:06.88781805 +0000 UTC m=+64.904285572" watchObservedRunningTime="2026-04-16 17:41:07.332392161 +0000 UTC m=+65.348859686" Apr 16 17:41:07.334164 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:07.333198 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-plrpx"] Apr 16 17:41:07.365231 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:07.365153 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-plrpx"] Apr 16 17:41:07.365429 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:07.365302 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-plrpx" Apr 16 17:41:07.367962 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:07.367897 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 16 17:41:07.368299 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:07.368263 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-t7l57\"" Apr 16 17:41:07.417561 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:07.417498 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/9f2ec8b8-30a4-4029-bcbf-b65983bf98df-tls-certificates\") pod \"prometheus-operator-admission-webhook-9cb97cd87-plrpx\" (UID: \"9f2ec8b8-30a4-4029-bcbf-b65983bf98df\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-plrpx" Apr 16 17:41:07.519455 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:07.518991 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/9f2ec8b8-30a4-4029-bcbf-b65983bf98df-tls-certificates\") pod \"prometheus-operator-admission-webhook-9cb97cd87-plrpx\" (UID: \"9f2ec8b8-30a4-4029-bcbf-b65983bf98df\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-plrpx" Apr 16 17:41:07.519455 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:41:07.519154 2576 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" not found Apr 16 17:41:07.519455 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:41:07.519216 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f2ec8b8-30a4-4029-bcbf-b65983bf98df-tls-certificates podName:9f2ec8b8-30a4-4029-bcbf-b65983bf98df nodeName:}" failed. No retries permitted until 2026-04-16 17:41:08.019197244 +0000 UTC m=+66.035664746 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/9f2ec8b8-30a4-4029-bcbf-b65983bf98df-tls-certificates") pod "prometheus-operator-admission-webhook-9cb97cd87-plrpx" (UID: "9f2ec8b8-30a4-4029-bcbf-b65983bf98df") : secret "prometheus-operator-admission-webhook-tls" not found Apr 16 17:41:08.022449 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:08.022408 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/9f2ec8b8-30a4-4029-bcbf-b65983bf98df-tls-certificates\") pod \"prometheus-operator-admission-webhook-9cb97cd87-plrpx\" (UID: \"9f2ec8b8-30a4-4029-bcbf-b65983bf98df\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-plrpx" Apr 16 17:41:08.025836 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:08.025782 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: 
\"kubernetes.io/secret/9f2ec8b8-30a4-4029-bcbf-b65983bf98df-tls-certificates\") pod \"prometheus-operator-admission-webhook-9cb97cd87-plrpx\" (UID: \"9f2ec8b8-30a4-4029-bcbf-b65983bf98df\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-plrpx" Apr 16 17:41:08.279663 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:08.279560 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-plrpx" Apr 16 17:41:08.325304 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:08.325268 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/98ed775b-36f2-475e-9c1b-e1e3a5261ed5-metrics-certs\") pod \"network-metrics-daemon-tw2xb\" (UID: \"98ed775b-36f2-475e-9c1b-e1e3a5261ed5\") " pod="openshift-multus/network-metrics-daemon-tw2xb" Apr 16 17:41:08.328883 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:08.328849 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/98ed775b-36f2-475e-9c1b-e1e3a5261ed5-metrics-certs\") pod \"network-metrics-daemon-tw2xb\" (UID: \"98ed775b-36f2-475e-9c1b-e1e3a5261ed5\") " pod="openshift-multus/network-metrics-daemon-tw2xb" Apr 16 17:41:08.339325 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:08.339262 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-wdxwt\"" Apr 16 17:41:08.347223 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:08.346901 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-tw2xb" Apr 16 17:41:08.435612 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:08.435558 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-plrpx"] Apr 16 17:41:09.000992 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:41:09.000951 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f2ec8b8_30a4_4029_bcbf_b65983bf98df.slice/crio-811db2aa4d8b3596b9376b24d3dd456b82c44ec62c939621638665b70b321ab9 WatchSource:0}: Error finding container 811db2aa4d8b3596b9376b24d3dd456b82c44ec62c939621638665b70b321ab9: Status 404 returned error can't find the container with id 811db2aa4d8b3596b9376b24d3dd456b82c44ec62c939621638665b70b321ab9 Apr 16 17:41:09.887717 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:09.887664 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-plrpx" event={"ID":"9f2ec8b8-30a4-4029-bcbf-b65983bf98df","Type":"ContainerStarted","Data":"811db2aa4d8b3596b9376b24d3dd456b82c44ec62c939621638665b70b321ab9"} Apr 16 17:41:10.944260 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:10.944221 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-tw2xb"] Apr 16 17:41:10.949287 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:41:10.949257 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod98ed775b_36f2_475e_9c1b_e1e3a5261ed5.slice/crio-a894b39c16ef097d21d97e034ecf3daeb4a26155f394a94f1642b90e04367069 WatchSource:0}: Error finding container a894b39c16ef097d21d97e034ecf3daeb4a26155f394a94f1642b90e04367069: Status 404 returned error can't find the container with id a894b39c16ef097d21d97e034ecf3daeb4a26155f394a94f1642b90e04367069 Apr 16 17:41:11.782342 
ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:11.781758 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-bs2mp" Apr 16 17:41:11.897751 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:11.897722 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-5t6rd" event={"ID":"d2bce270-4a99-4e0a-a841-56b07aa9d0cb","Type":"ContainerStarted","Data":"484b05fc4820b9487e912d47c1c86b5d97877476a62856f5f8ebfcf1f6f75d28"} Apr 16 17:41:11.899354 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:11.899322 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-plrpx" event={"ID":"9f2ec8b8-30a4-4029-bcbf-b65983bf98df","Type":"ContainerStarted","Data":"6d2a6ca2454884d7c931db1fe856e1510ffcfe2a2c5403f1329c0155b85f506f"} Apr 16 17:41:11.899702 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:11.899680 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-plrpx" Apr 16 17:41:11.901116 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:11.901081 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-tw2xb" event={"ID":"98ed775b-36f2-475e-9c1b-e1e3a5261ed5","Type":"ContainerStarted","Data":"a894b39c16ef097d21d97e034ecf3daeb4a26155f394a94f1642b90e04367069"} Apr 16 17:41:11.903721 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:11.903687 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-546f885c99-8d5wc" event={"ID":"67580a9a-b1fe-4921-9484-c90f744ccc62","Type":"ContainerStarted","Data":"d3b7574ce24bc6b49c08cee062a8b730a3c715807facf4b41d304cd620f77bd4"} Apr 16 17:41:11.906512 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:11.906492 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-plrpx" Apr 16 17:41:11.917798 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:11.917540 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-5785d4fcdd-5t6rd" podStartSLOduration=4.613962397 podStartE2EDuration="8.917511775s" podCreationTimestamp="2026-04-16 17:41:03 +0000 UTC" firstStartedPulling="2026-04-16 17:41:06.501727548 +0000 UTC m=+64.518195048" lastFinishedPulling="2026-04-16 17:41:10.805276913 +0000 UTC m=+68.821744426" observedRunningTime="2026-04-16 17:41:11.917035643 +0000 UTC m=+69.933503165" watchObservedRunningTime="2026-04-16 17:41:11.917511775 +0000 UTC m=+69.933979299" Apr 16 17:41:11.937366 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:11.937096 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-plrpx" podStartSLOduration=2.308524352 podStartE2EDuration="4.937075924s" podCreationTimestamp="2026-04-16 17:41:07 +0000 UTC" firstStartedPulling="2026-04-16 17:41:09.003436389 +0000 UTC m=+67.019903892" lastFinishedPulling="2026-04-16 17:41:11.631987956 +0000 UTC m=+69.648455464" observedRunningTime="2026-04-16 17:41:11.934660375 +0000 UTC m=+69.951127897" watchObservedRunningTime="2026-04-16 17:41:11.937075924 +0000 UTC m=+69.953543445" Apr 16 17:41:11.954806 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:11.954061 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-546f885c99-8d5wc" podStartSLOduration=2.571130852 podStartE2EDuration="7.954038649s" podCreationTimestamp="2026-04-16 17:41:04 +0000 UTC" firstStartedPulling="2026-04-16 17:41:05.431198502 +0000 UTC m=+63.447666018" lastFinishedPulling="2026-04-16 17:41:10.814106301 +0000 UTC m=+68.830573815" observedRunningTime="2026-04-16 17:41:11.952783039 +0000 UTC m=+69.969250564" watchObservedRunningTime="2026-04-16 
17:41:11.954038649 +0000 UTC m=+69.970506171" Apr 16 17:41:12.909347 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:12.909285 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-tw2xb" event={"ID":"98ed775b-36f2-475e-9c1b-e1e3a5261ed5","Type":"ContainerStarted","Data":"1956359f9a3f5c997101d88484484308253b7a2cffbef23fcb6dc8c34a5d4b4e"} Apr 16 17:41:12.909347 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:12.909347 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-tw2xb" event={"ID":"98ed775b-36f2-475e-9c1b-e1e3a5261ed5","Type":"ContainerStarted","Data":"7ae8c4177c4de3f83acd1ec756eab0d4639ffb14db6a2a88e56967b8d1392a4a"} Apr 16 17:41:12.944846 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:12.944784 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-tw2xb" podStartSLOduration=69.535479763 podStartE2EDuration="1m10.94476699s" podCreationTimestamp="2026-04-16 17:40:02 +0000 UTC" firstStartedPulling="2026-04-16 17:41:10.953697482 +0000 UTC m=+68.970164999" lastFinishedPulling="2026-04-16 17:41:12.362984721 +0000 UTC m=+70.379452226" observedRunningTime="2026-04-16 17:41:12.943292294 +0000 UTC m=+70.959759816" watchObservedRunningTime="2026-04-16 17:41:12.94476699 +0000 UTC m=+70.961234512" Apr 16 17:41:13.810213 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:13.810181 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-57577fdccc-fltds" Apr 16 17:41:13.810659 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:13.810392 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-vszkl_643af40e-aabc-4fb6-8e21-7926a029b0a0/dns/0.log" Apr 16 17:41:13.994600 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:13.994576 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-dns_dns-default-vszkl_643af40e-aabc-4fb6-8e21-7926a029b0a0/kube-rbac-proxy/0.log" Apr 16 17:41:14.395327 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:14.395294 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-p4dbv_559281b5-6292-4642-95ef-022daeacb46b/dns-node-resolver/0.log" Apr 16 17:41:14.793943 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:14.793898 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-57577fdccc-fltds_a3c3daf4-9be0-4443-af47-0a67097365fa/registry/0.log" Apr 16 17:41:15.194858 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:15.194833 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-96578b4d-66dzw_e9d84c5e-5875-4d83-9674-ab0c1accad47/registry/0.log" Apr 16 17:41:15.273396 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:15.273358 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-546f885c99-8d5wc" Apr 16 17:41:15.273591 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:15.273421 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-546f885c99-8d5wc" Apr 16 17:41:15.278866 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:15.278832 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-546f885c99-8d5wc" Apr 16 17:41:15.394640 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:15.394615 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-cbg2b_77172d03-834d-4c9b-8b9c-2d2f57a663cd/node-ca/0.log" Apr 16 17:41:15.924629 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:15.924599 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-546f885c99-8d5wc" Apr 16 17:41:15.994742 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:15.994707 
2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-587cd5bb74-wz624_2c0dcd7e-0c96-40a1-a6fd-2c9e2109db1a/router/0.log" Apr 16 17:41:16.394342 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:16.394256 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-pc75n_3bf148cf-abbf-4345-8487-fd40ceff855c/serve-healthcheck-canary/0.log" Apr 16 17:41:17.835920 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:17.835380 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-j556q"] Apr 16 17:41:17.872252 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:17.872132 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-j556q" Apr 16 17:41:17.881382 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:17.881358 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 16 17:41:17.881510 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:17.881489 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-wqdvb\"" Apr 16 17:41:17.882292 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:17.881696 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 16 17:41:17.882292 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:17.881781 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 16 17:41:17.882292 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:17.881696 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 16 17:41:18.018407 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:18.018367 2576 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b08bc540-ee52-405a-91b1-6b666ac80a17-metrics-client-ca\") pod \"node-exporter-j556q\" (UID: \"b08bc540-ee52-405a-91b1-6b666ac80a17\") " pod="openshift-monitoring/node-exporter-j556q" Apr 16 17:41:18.018607 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:18.018415 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/b08bc540-ee52-405a-91b1-6b666ac80a17-root\") pod \"node-exporter-j556q\" (UID: \"b08bc540-ee52-405a-91b1-6b666ac80a17\") " pod="openshift-monitoring/node-exporter-j556q" Apr 16 17:41:18.018607 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:18.018504 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/b08bc540-ee52-405a-91b1-6b666ac80a17-node-exporter-accelerators-collector-config\") pod \"node-exporter-j556q\" (UID: \"b08bc540-ee52-405a-91b1-6b666ac80a17\") " pod="openshift-monitoring/node-exporter-j556q" Apr 16 17:41:18.018725 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:18.018602 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/b08bc540-ee52-405a-91b1-6b666ac80a17-node-exporter-textfile\") pod \"node-exporter-j556q\" (UID: \"b08bc540-ee52-405a-91b1-6b666ac80a17\") " pod="openshift-monitoring/node-exporter-j556q" Apr 16 17:41:18.018725 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:18.018642 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b08bc540-ee52-405a-91b1-6b666ac80a17-node-exporter-tls\") pod \"node-exporter-j556q\" (UID: 
\"b08bc540-ee52-405a-91b1-6b666ac80a17\") " pod="openshift-monitoring/node-exporter-j556q" Apr 16 17:41:18.018814 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:18.018715 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8bts\" (UniqueName: \"kubernetes.io/projected/b08bc540-ee52-405a-91b1-6b666ac80a17-kube-api-access-b8bts\") pod \"node-exporter-j556q\" (UID: \"b08bc540-ee52-405a-91b1-6b666ac80a17\") " pod="openshift-monitoring/node-exporter-j556q" Apr 16 17:41:18.018814 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:18.018754 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b08bc540-ee52-405a-91b1-6b666ac80a17-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-j556q\" (UID: \"b08bc540-ee52-405a-91b1-6b666ac80a17\") " pod="openshift-monitoring/node-exporter-j556q" Apr 16 17:41:18.018814 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:18.018787 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b08bc540-ee52-405a-91b1-6b666ac80a17-sys\") pod \"node-exporter-j556q\" (UID: \"b08bc540-ee52-405a-91b1-6b666ac80a17\") " pod="openshift-monitoring/node-exporter-j556q" Apr 16 17:41:18.018984 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:18.018827 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/b08bc540-ee52-405a-91b1-6b666ac80a17-node-exporter-wtmp\") pod \"node-exporter-j556q\" (UID: \"b08bc540-ee52-405a-91b1-6b666ac80a17\") " pod="openshift-monitoring/node-exporter-j556q" Apr 16 17:41:18.120134 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:18.120049 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" 
(UniqueName: \"kubernetes.io/host-path/b08bc540-ee52-405a-91b1-6b666ac80a17-node-exporter-wtmp\") pod \"node-exporter-j556q\" (UID: \"b08bc540-ee52-405a-91b1-6b666ac80a17\") " pod="openshift-monitoring/node-exporter-j556q"
Apr 16 17:41:18.120134 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:18.120121 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b08bc540-ee52-405a-91b1-6b666ac80a17-metrics-client-ca\") pod \"node-exporter-j556q\" (UID: \"b08bc540-ee52-405a-91b1-6b666ac80a17\") " pod="openshift-monitoring/node-exporter-j556q"
Apr 16 17:41:18.120351 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:18.120147 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/b08bc540-ee52-405a-91b1-6b666ac80a17-root\") pod \"node-exporter-j556q\" (UID: \"b08bc540-ee52-405a-91b1-6b666ac80a17\") " pod="openshift-monitoring/node-exporter-j556q"
Apr 16 17:41:18.120351 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:18.120179 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/b08bc540-ee52-405a-91b1-6b666ac80a17-node-exporter-accelerators-collector-config\") pod \"node-exporter-j556q\" (UID: \"b08bc540-ee52-405a-91b1-6b666ac80a17\") " pod="openshift-monitoring/node-exporter-j556q"
Apr 16 17:41:18.120351 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:18.120223 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/b08bc540-ee52-405a-91b1-6b666ac80a17-node-exporter-textfile\") pod \"node-exporter-j556q\" (UID: \"b08bc540-ee52-405a-91b1-6b666ac80a17\") " pod="openshift-monitoring/node-exporter-j556q"
Apr 16 17:41:18.120351 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:18.120243 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b08bc540-ee52-405a-91b1-6b666ac80a17-node-exporter-tls\") pod \"node-exporter-j556q\" (UID: \"b08bc540-ee52-405a-91b1-6b666ac80a17\") " pod="openshift-monitoring/node-exporter-j556q"
Apr 16 17:41:18.120351 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:18.120238 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/b08bc540-ee52-405a-91b1-6b666ac80a17-node-exporter-wtmp\") pod \"node-exporter-j556q\" (UID: \"b08bc540-ee52-405a-91b1-6b666ac80a17\") " pod="openshift-monitoring/node-exporter-j556q"
Apr 16 17:41:18.120351 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:18.120282 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b8bts\" (UniqueName: \"kubernetes.io/projected/b08bc540-ee52-405a-91b1-6b666ac80a17-kube-api-access-b8bts\") pod \"node-exporter-j556q\" (UID: \"b08bc540-ee52-405a-91b1-6b666ac80a17\") " pod="openshift-monitoring/node-exporter-j556q"
Apr 16 17:41:18.120351 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:18.120306 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b08bc540-ee52-405a-91b1-6b666ac80a17-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-j556q\" (UID: \"b08bc540-ee52-405a-91b1-6b666ac80a17\") " pod="openshift-monitoring/node-exporter-j556q"
Apr 16 17:41:18.120351 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:18.120332 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b08bc540-ee52-405a-91b1-6b666ac80a17-sys\") pod \"node-exporter-j556q\" (UID: \"b08bc540-ee52-405a-91b1-6b666ac80a17\") " pod="openshift-monitoring/node-exporter-j556q"
Apr 16 17:41:18.120763 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:18.120399 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b08bc540-ee52-405a-91b1-6b666ac80a17-sys\") pod \"node-exporter-j556q\" (UID: \"b08bc540-ee52-405a-91b1-6b666ac80a17\") " pod="openshift-monitoring/node-exporter-j556q"
Apr 16 17:41:18.120763 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:18.120705 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/b08bc540-ee52-405a-91b1-6b666ac80a17-node-exporter-textfile\") pod \"node-exporter-j556q\" (UID: \"b08bc540-ee52-405a-91b1-6b666ac80a17\") " pod="openshift-monitoring/node-exporter-j556q"
Apr 16 17:41:18.120874 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:18.120760 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b08bc540-ee52-405a-91b1-6b666ac80a17-metrics-client-ca\") pod \"node-exporter-j556q\" (UID: \"b08bc540-ee52-405a-91b1-6b666ac80a17\") " pod="openshift-monitoring/node-exporter-j556q"
Apr 16 17:41:18.120874 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:18.120782 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/b08bc540-ee52-405a-91b1-6b666ac80a17-node-exporter-accelerators-collector-config\") pod \"node-exporter-j556q\" (UID: \"b08bc540-ee52-405a-91b1-6b666ac80a17\") " pod="openshift-monitoring/node-exporter-j556q"
Apr 16 17:41:18.120874 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:18.120819 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/b08bc540-ee52-405a-91b1-6b666ac80a17-root\") pod \"node-exporter-j556q\" (UID: \"b08bc540-ee52-405a-91b1-6b666ac80a17\") " pod="openshift-monitoring/node-exporter-j556q"
Apr 16 17:41:18.123311 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:18.123282 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b08bc540-ee52-405a-91b1-6b666ac80a17-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-j556q\" (UID: \"b08bc540-ee52-405a-91b1-6b666ac80a17\") " pod="openshift-monitoring/node-exporter-j556q"
Apr 16 17:41:18.123437 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:18.123378 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b08bc540-ee52-405a-91b1-6b666ac80a17-node-exporter-tls\") pod \"node-exporter-j556q\" (UID: \"b08bc540-ee52-405a-91b1-6b666ac80a17\") " pod="openshift-monitoring/node-exporter-j556q"
Apr 16 17:41:18.145598 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:18.145535 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8bts\" (UniqueName: \"kubernetes.io/projected/b08bc540-ee52-405a-91b1-6b666ac80a17-kube-api-access-b8bts\") pod \"node-exporter-j556q\" (UID: \"b08bc540-ee52-405a-91b1-6b666ac80a17\") " pod="openshift-monitoring/node-exporter-j556q"
Apr 16 17:41:18.185509 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:18.185469 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-j556q"
Apr 16 17:41:21.941366 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:21.941329 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-j556q" event={"ID":"b08bc540-ee52-405a-91b1-6b666ac80a17","Type":"ContainerStarted","Data":"31e41ec5d53f698aa10532f94be97dcedb41430cec9109378e369a6069d310fe"}
Apr 16 17:41:22.946563 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:22.946470 2576 generic.go:358] "Generic (PLEG): container finished" podID="b08bc540-ee52-405a-91b1-6b666ac80a17" containerID="b7da62af2077554847000cecffd5a03294f8ebdb8ae1c141f0b079bfc36582c8" exitCode=0
Apr 16 17:41:22.946563 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:22.946541 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-j556q" event={"ID":"b08bc540-ee52-405a-91b1-6b666ac80a17","Type":"ContainerDied","Data":"b7da62af2077554847000cecffd5a03294f8ebdb8ae1c141f0b079bfc36582c8"}
Apr 16 17:41:22.948369 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:22.948345 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-586b57c7b4-r58vb" event={"ID":"421e5aad-dcec-441f-b232-eff2d7c6af79","Type":"ContainerStarted","Data":"8ea37bc537914b5e890cc89f4f1a11edda29d7c245f81646203db35e80f62d7d"}
Apr 16 17:41:22.948586 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:22.948562 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-586b57c7b4-r58vb"
Apr 16 17:41:22.964389 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:22.964357 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-586b57c7b4-r58vb"
Apr 16 17:41:22.985501 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:22.985440 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-586b57c7b4-r58vb" podStartSLOduration=2.401883575 podStartE2EDuration="19.985417557s" podCreationTimestamp="2026-04-16 17:41:03 +0000 UTC" firstStartedPulling="2026-04-16 17:41:04.417160449 +0000 UTC m=+62.433627962" lastFinishedPulling="2026-04-16 17:41:22.000694433 +0000 UTC m=+80.017161944" observedRunningTime="2026-04-16 17:41:22.985414404 +0000 UTC m=+81.001881925" watchObservedRunningTime="2026-04-16 17:41:22.985417557 +0000 UTC m=+81.001885106"
Apr 16 17:41:23.953892 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:23.953849 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-j556q" event={"ID":"b08bc540-ee52-405a-91b1-6b666ac80a17","Type":"ContainerStarted","Data":"5cdb96b299cc3e8a501beda81bc3d9b603ff594a5264d7dd29c62e49849426e0"}
Apr 16 17:41:23.953892 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:23.953897 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-j556q" event={"ID":"b08bc540-ee52-405a-91b1-6b666ac80a17","Type":"ContainerStarted","Data":"2d96eee6bce3db3d074461c548a7e9763650218733f1e0348a80628ebb2c07e7"}
Apr 16 17:41:23.977740 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:23.977682 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-j556q" podStartSLOduration=6.228255231 podStartE2EDuration="6.977660673s" podCreationTimestamp="2026-04-16 17:41:17 +0000 UTC" firstStartedPulling="2026-04-16 17:41:21.911009803 +0000 UTC m=+79.927477303" lastFinishedPulling="2026-04-16 17:41:22.660415121 +0000 UTC m=+80.676882745" observedRunningTime="2026-04-16 17:41:23.975546515 +0000 UTC m=+81.992014037" watchObservedRunningTime="2026-04-16 17:41:23.977660673 +0000 UTC m=+81.994128197"
Apr 16 17:41:25.865811 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:25.865781 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-96578b4d-66dzw"
Apr 16 17:41:27.123220 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:27.123182 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-x72lh"]
Apr 16 17:41:27.159587 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:27.159550 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-x72lh"]
Apr 16 17:41:27.159769 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:27.159708 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-x72lh"
Apr 16 17:41:27.162797 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:27.162770 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 16 17:41:27.162966 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:27.162812 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 16 17:41:27.163037 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:27.162985 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-kg8l7\""
Apr 16 17:41:27.305790 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:27.305753 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/d05422b4-dafd-47da-b046-07325a713255-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-x72lh\" (UID: \"d05422b4-dafd-47da-b046-07325a713255\") " pod="openshift-insights/insights-runtime-extractor-x72lh"
Apr 16 17:41:27.306004 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:27.305839 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnvvf\" (UniqueName: \"kubernetes.io/projected/d05422b4-dafd-47da-b046-07325a713255-kube-api-access-wnvvf\") pod \"insights-runtime-extractor-x72lh\" (UID: \"d05422b4-dafd-47da-b046-07325a713255\") " pod="openshift-insights/insights-runtime-extractor-x72lh"
Apr 16 17:41:27.306004 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:27.305956 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/d05422b4-dafd-47da-b046-07325a713255-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-x72lh\" (UID: \"d05422b4-dafd-47da-b046-07325a713255\") " pod="openshift-insights/insights-runtime-extractor-x72lh"
Apr 16 17:41:27.306004 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:27.305985 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/d05422b4-dafd-47da-b046-07325a713255-data-volume\") pod \"insights-runtime-extractor-x72lh\" (UID: \"d05422b4-dafd-47da-b046-07325a713255\") " pod="openshift-insights/insights-runtime-extractor-x72lh"
Apr 16 17:41:27.306210 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:27.306089 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/d05422b4-dafd-47da-b046-07325a713255-crio-socket\") pod \"insights-runtime-extractor-x72lh\" (UID: \"d05422b4-dafd-47da-b046-07325a713255\") " pod="openshift-insights/insights-runtime-extractor-x72lh"
Apr 16 17:41:27.406702 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:27.406659 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/d05422b4-dafd-47da-b046-07325a713255-crio-socket\") pod \"insights-runtime-extractor-x72lh\" (UID: \"d05422b4-dafd-47da-b046-07325a713255\") " pod="openshift-insights/insights-runtime-extractor-x72lh"
Apr 16 17:41:27.406888 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:27.406721 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/d05422b4-dafd-47da-b046-07325a713255-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-x72lh\" (UID: \"d05422b4-dafd-47da-b046-07325a713255\") " pod="openshift-insights/insights-runtime-extractor-x72lh"
Apr 16 17:41:27.406888 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:27.406779 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wnvvf\" (UniqueName: \"kubernetes.io/projected/d05422b4-dafd-47da-b046-07325a713255-kube-api-access-wnvvf\") pod \"insights-runtime-extractor-x72lh\" (UID: \"d05422b4-dafd-47da-b046-07325a713255\") " pod="openshift-insights/insights-runtime-extractor-x72lh"
Apr 16 17:41:27.406888 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:27.406805 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/d05422b4-dafd-47da-b046-07325a713255-crio-socket\") pod \"insights-runtime-extractor-x72lh\" (UID: \"d05422b4-dafd-47da-b046-07325a713255\") " pod="openshift-insights/insights-runtime-extractor-x72lh"
Apr 16 17:41:27.406888 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:27.406824 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/d05422b4-dafd-47da-b046-07325a713255-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-x72lh\" (UID: \"d05422b4-dafd-47da-b046-07325a713255\") " pod="openshift-insights/insights-runtime-extractor-x72lh"
Apr 16 17:41:27.406888 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:27.406842 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/d05422b4-dafd-47da-b046-07325a713255-data-volume\") pod \"insights-runtime-extractor-x72lh\" (UID: \"d05422b4-dafd-47da-b046-07325a713255\") " pod="openshift-insights/insights-runtime-extractor-x72lh"
Apr 16 17:41:27.407124 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:27.407110 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/d05422b4-dafd-47da-b046-07325a713255-data-volume\") pod \"insights-runtime-extractor-x72lh\" (UID: \"d05422b4-dafd-47da-b046-07325a713255\") " pod="openshift-insights/insights-runtime-extractor-x72lh"
Apr 16 17:41:27.407202 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:41:27.407190 2576 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found
Apr 16 17:41:27.407261 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:41:27.407247 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d05422b4-dafd-47da-b046-07325a713255-insights-runtime-extractor-tls podName:d05422b4-dafd-47da-b046-07325a713255 nodeName:}" failed. No retries permitted until 2026-04-16 17:41:27.907229423 +0000 UTC m=+85.923696922 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/d05422b4-dafd-47da-b046-07325a713255-insights-runtime-extractor-tls") pod "insights-runtime-extractor-x72lh" (UID: "d05422b4-dafd-47da-b046-07325a713255") : secret "insights-runtime-extractor-tls" not found
Apr 16 17:41:27.407364 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:27.407345 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/d05422b4-dafd-47da-b046-07325a713255-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-x72lh\" (UID: \"d05422b4-dafd-47da-b046-07325a713255\") " pod="openshift-insights/insights-runtime-extractor-x72lh"
Apr 16 17:41:27.422987 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:27.422957 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnvvf\" (UniqueName: \"kubernetes.io/projected/d05422b4-dafd-47da-b046-07325a713255-kube-api-access-wnvvf\") pod \"insights-runtime-extractor-x72lh\" (UID: \"d05422b4-dafd-47da-b046-07325a713255\") " pod="openshift-insights/insights-runtime-extractor-x72lh"
Apr 16 17:41:27.912011 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:27.911971 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/d05422b4-dafd-47da-b046-07325a713255-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-x72lh\" (UID: \"d05422b4-dafd-47da-b046-07325a713255\") " pod="openshift-insights/insights-runtime-extractor-x72lh"
Apr 16 17:41:27.914597 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:27.914570 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/d05422b4-dafd-47da-b046-07325a713255-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-x72lh\" (UID: \"d05422b4-dafd-47da-b046-07325a713255\") " pod="openshift-insights/insights-runtime-extractor-x72lh"
Apr 16 17:41:28.071681 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:28.071626 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-x72lh"
Apr 16 17:41:28.216358 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:28.216280 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-x72lh"]
Apr 16 17:41:28.220043 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:41:28.220008 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd05422b4_dafd_47da_b046_07325a713255.slice/crio-7dded884303e59575d5f7c1f9f160e55ab24144462d410027f345001b900413c WatchSource:0}: Error finding container 7dded884303e59575d5f7c1f9f160e55ab24144462d410027f345001b900413c: Status 404 returned error can't find the container with id 7dded884303e59575d5f7c1f9f160e55ab24144462d410027f345001b900413c
Apr 16 17:41:28.823144 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:28.823097 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-57577fdccc-fltds" podUID="a3c3daf4-9be0-4443-af47-0a67097365fa" containerName="registry" containerID="cri-o://d6d6712d5f748543dc8ef283ca782bbf028b4029821001871bfbb87bb6282772" gracePeriod=30
Apr 16 17:41:28.976122 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:28.976027 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-x72lh" event={"ID":"d05422b4-dafd-47da-b046-07325a713255","Type":"ContainerStarted","Data":"ac83f1c9f398f8fbe2302a2c8dfee4066045929c7b912879364ff23295714dce"}
Apr 16 17:41:28.976122 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:28.976081 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-x72lh" event={"ID":"d05422b4-dafd-47da-b046-07325a713255","Type":"ContainerStarted","Data":"7dded884303e59575d5f7c1f9f160e55ab24144462d410027f345001b900413c"}
Apr 16 17:41:28.978320 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:28.978236 2576 generic.go:358] "Generic (PLEG): container finished" podID="a3c3daf4-9be0-4443-af47-0a67097365fa" containerID="d6d6712d5f748543dc8ef283ca782bbf028b4029821001871bfbb87bb6282772" exitCode=0
Apr 16 17:41:28.978320 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:28.978285 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-57577fdccc-fltds" event={"ID":"a3c3daf4-9be0-4443-af47-0a67097365fa","Type":"ContainerDied","Data":"d6d6712d5f748543dc8ef283ca782bbf028b4029821001871bfbb87bb6282772"}
Apr 16 17:41:29.181935 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:29.181888 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-57577fdccc-fltds"
Apr 16 17:41:29.220470 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:29.220434 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a3c3daf4-9be0-4443-af47-0a67097365fa-bound-sa-token\") pod \"a3c3daf4-9be0-4443-af47-0a67097365fa\" (UID: \"a3c3daf4-9be0-4443-af47-0a67097365fa\") "
Apr 16 17:41:29.220965 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:29.220484 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a3c3daf4-9be0-4443-af47-0a67097365fa-trusted-ca\") pod \"a3c3daf4-9be0-4443-af47-0a67097365fa\" (UID: \"a3c3daf4-9be0-4443-af47-0a67097365fa\") "
Apr 16 17:41:29.220965 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:29.220510 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a3c3daf4-9be0-4443-af47-0a67097365fa-registry-tls\") pod \"a3c3daf4-9be0-4443-af47-0a67097365fa\" (UID: \"a3c3daf4-9be0-4443-af47-0a67097365fa\") "
Apr 16 17:41:29.220965 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:29.220584 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/a3c3daf4-9be0-4443-af47-0a67097365fa-image-registry-private-configuration\") pod \"a3c3daf4-9be0-4443-af47-0a67097365fa\" (UID: \"a3c3daf4-9be0-4443-af47-0a67097365fa\") "
Apr 16 17:41:29.220965 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:29.220612 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a3c3daf4-9be0-4443-af47-0a67097365fa-installation-pull-secrets\") pod \"a3c3daf4-9be0-4443-af47-0a67097365fa\" (UID: \"a3c3daf4-9be0-4443-af47-0a67097365fa\") "
Apr 16 17:41:29.220965 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:29.220656 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a3c3daf4-9be0-4443-af47-0a67097365fa-registry-certificates\") pod \"a3c3daf4-9be0-4443-af47-0a67097365fa\" (UID: \"a3c3daf4-9be0-4443-af47-0a67097365fa\") "
Apr 16 17:41:29.220965 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:29.220685 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-47fjl\" (UniqueName: \"kubernetes.io/projected/a3c3daf4-9be0-4443-af47-0a67097365fa-kube-api-access-47fjl\") pod \"a3c3daf4-9be0-4443-af47-0a67097365fa\" (UID: \"a3c3daf4-9be0-4443-af47-0a67097365fa\") "
Apr 16 17:41:29.220965 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:29.220710 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a3c3daf4-9be0-4443-af47-0a67097365fa-ca-trust-extracted\") pod \"a3c3daf4-9be0-4443-af47-0a67097365fa\" (UID: \"a3c3daf4-9be0-4443-af47-0a67097365fa\") "
Apr 16 17:41:29.220965 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:29.220930 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3c3daf4-9be0-4443-af47-0a67097365fa-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a3c3daf4-9be0-4443-af47-0a67097365fa" (UID: "a3c3daf4-9be0-4443-af47-0a67097365fa"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 17:41:29.221369 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:29.221171 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3c3daf4-9be0-4443-af47-0a67097365fa-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "a3c3daf4-9be0-4443-af47-0a67097365fa" (UID: "a3c3daf4-9be0-4443-af47-0a67097365fa"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 17:41:29.225204 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:29.225144 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3c3daf4-9be0-4443-af47-0a67097365fa-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "a3c3daf4-9be0-4443-af47-0a67097365fa" (UID: "a3c3daf4-9be0-4443-af47-0a67097365fa"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 17:41:29.225608 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:29.225489 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3c3daf4-9be0-4443-af47-0a67097365fa-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "a3c3daf4-9be0-4443-af47-0a67097365fa" (UID: "a3c3daf4-9be0-4443-af47-0a67097365fa"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 17:41:29.225962 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:29.225851 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3c3daf4-9be0-4443-af47-0a67097365fa-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a3c3daf4-9be0-4443-af47-0a67097365fa" (UID: "a3c3daf4-9be0-4443-af47-0a67097365fa"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 17:41:29.230316 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:29.230116 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3c3daf4-9be0-4443-af47-0a67097365fa-kube-api-access-47fjl" (OuterVolumeSpecName: "kube-api-access-47fjl") pod "a3c3daf4-9be0-4443-af47-0a67097365fa" (UID: "a3c3daf4-9be0-4443-af47-0a67097365fa"). InnerVolumeSpecName "kube-api-access-47fjl". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 17:41:29.230316 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:29.230159 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3c3daf4-9be0-4443-af47-0a67097365fa-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "a3c3daf4-9be0-4443-af47-0a67097365fa" (UID: "a3c3daf4-9be0-4443-af47-0a67097365fa"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 17:41:29.233693 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:29.233656 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3c3daf4-9be0-4443-af47-0a67097365fa-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "a3c3daf4-9be0-4443-af47-0a67097365fa" (UID: "a3c3daf4-9be0-4443-af47-0a67097365fa"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 17:41:29.321666 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:29.321627 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-47fjl\" (UniqueName: \"kubernetes.io/projected/a3c3daf4-9be0-4443-af47-0a67097365fa-kube-api-access-47fjl\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\""
Apr 16 17:41:29.321666 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:29.321665 2576 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a3c3daf4-9be0-4443-af47-0a67097365fa-ca-trust-extracted\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\""
Apr 16 17:41:29.321954 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:29.321678 2576 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a3c3daf4-9be0-4443-af47-0a67097365fa-bound-sa-token\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\""
Apr 16 17:41:29.321954 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:29.321691 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a3c3daf4-9be0-4443-af47-0a67097365fa-trusted-ca\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\""
Apr 16 17:41:29.321954 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:29.321703 2576 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a3c3daf4-9be0-4443-af47-0a67097365fa-registry-tls\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\""
Apr 16 17:41:29.321954 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:29.321715 2576 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/a3c3daf4-9be0-4443-af47-0a67097365fa-image-registry-private-configuration\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\""
Apr 16 17:41:29.321954 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:29.321729 2576 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a3c3daf4-9be0-4443-af47-0a67097365fa-installation-pull-secrets\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\""
Apr 16 17:41:29.321954 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:29.321771 2576 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a3c3daf4-9be0-4443-af47-0a67097365fa-registry-certificates\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\""
Apr 16 17:41:29.985865 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:29.985829 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-x72lh" event={"ID":"d05422b4-dafd-47da-b046-07325a713255","Type":"ContainerStarted","Data":"e42e438cd741a3c208f3cf3a7cd140a938e5501eed27de51233e843580eeb45a"}
Apr 16 17:41:29.987669 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:29.987633 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-57577fdccc-fltds" event={"ID":"a3c3daf4-9be0-4443-af47-0a67097365fa","Type":"ContainerDied","Data":"bd17b89a355226014aa89f252f6f5fa8d71b95e01889d9e7bb4c8c9277bd884c"}
Apr 16 17:41:29.987845 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:29.987693 2576 scope.go:117] "RemoveContainer" containerID="d6d6712d5f748543dc8ef283ca782bbf028b4029821001871bfbb87bb6282772"
Apr 16 17:41:29.987845 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:29.987837 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-57577fdccc-fltds"
Apr 16 17:41:30.020351 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:30.020311 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-57577fdccc-fltds"]
Apr 16 17:41:30.033136 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:30.033013 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-57577fdccc-fltds"]
Apr 16 17:41:30.629969 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:30.629928 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3c3daf4-9be0-4443-af47-0a67097365fa" path="/var/lib/kubelet/pods/a3c3daf4-9be0-4443-af47-0a67097365fa/volumes"
Apr 16 17:41:31.999437 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:31.999403 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-x72lh" event={"ID":"d05422b4-dafd-47da-b046-07325a713255","Type":"ContainerStarted","Data":"80c9b85165e707f6c019aff0ac4ce0efde233877240ab8f744943036c9be66c6"}
Apr 16 17:41:32.021611 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:32.021550 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-x72lh" podStartSLOduration=1.865585702 podStartE2EDuration="5.021534359s" podCreationTimestamp="2026-04-16 17:41:27 +0000 UTC" firstStartedPulling="2026-04-16 17:41:28.356217259 +0000 UTC m=+86.372684772" lastFinishedPulling="2026-04-16 17:41:31.512165723 +0000 UTC m=+89.528633429" observedRunningTime="2026-04-16 17:41:32.02023975 +0000 UTC m=+90.036707271" watchObservedRunningTime="2026-04-16 17:41:32.021534359 +0000 UTC m=+90.038001917"
Apr 16 17:41:34.656641 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:34.656608 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-546f885c99-8d5wc"]
Apr 16 17:41:59.675806 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:59.675761 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-546f885c99-8d5wc" podUID="67580a9a-b1fe-4921-9484-c90f744ccc62" containerName="console" containerID="cri-o://d3b7574ce24bc6b49c08cee062a8b730a3c715807facf4b41d304cd620f77bd4" gracePeriod=15
Apr 16 17:41:59.957864 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:59.957842 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-546f885c99-8d5wc_67580a9a-b1fe-4921-9484-c90f744ccc62/console/0.log"
Apr 16 17:41:59.958017 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:41:59.957901 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-546f885c99-8d5wc"
Apr 16 17:42:00.046737 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:42:00.046700 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/67580a9a-b1fe-4921-9484-c90f744ccc62-service-ca\") pod \"67580a9a-b1fe-4921-9484-c90f744ccc62\" (UID: \"67580a9a-b1fe-4921-9484-c90f744ccc62\") "
Apr 16 17:42:00.046924 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:42:00.046753 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/67580a9a-b1fe-4921-9484-c90f744ccc62-console-serving-cert\") pod \"67580a9a-b1fe-4921-9484-c90f744ccc62\" (UID: \"67580a9a-b1fe-4921-9484-c90f744ccc62\") "
Apr 16 17:42:00.046924 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:42:00.046790 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/67580a9a-b1fe-4921-9484-c90f744ccc62-console-config\") pod \"67580a9a-b1fe-4921-9484-c90f744ccc62\" (UID: \"67580a9a-b1fe-4921-9484-c90f744ccc62\") "
Apr 16 17:42:00.047018 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:42:00.046953 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/67580a9a-b1fe-4921-9484-c90f744ccc62-oauth-serving-cert\") pod \"67580a9a-b1fe-4921-9484-c90f744ccc62\" (UID: \"67580a9a-b1fe-4921-9484-c90f744ccc62\") "
Apr 16 17:42:00.047018 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:42:00.047008 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/67580a9a-b1fe-4921-9484-c90f744ccc62-console-oauth-config\") pod \"67580a9a-b1fe-4921-9484-c90f744ccc62\" (UID: \"67580a9a-b1fe-4921-9484-c90f744ccc62\") "
Apr 16 17:42:00.047096 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:42:00.047040 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lrfmt\" (UniqueName: \"kubernetes.io/projected/67580a9a-b1fe-4921-9484-c90f744ccc62-kube-api-access-lrfmt\") pod \"67580a9a-b1fe-4921-9484-c90f744ccc62\" (UID: \"67580a9a-b1fe-4921-9484-c90f744ccc62\") "
Apr 16 17:42:00.047206 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:42:00.047184 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67580a9a-b1fe-4921-9484-c90f744ccc62-console-config" (OuterVolumeSpecName: "console-config") pod "67580a9a-b1fe-4921-9484-c90f744ccc62" (UID: "67580a9a-b1fe-4921-9484-c90f744ccc62"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 17:42:00.047268 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:42:00.047205 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67580a9a-b1fe-4921-9484-c90f744ccc62-service-ca" (OuterVolumeSpecName: "service-ca") pod "67580a9a-b1fe-4921-9484-c90f744ccc62" (UID: "67580a9a-b1fe-4921-9484-c90f744ccc62"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 17:42:00.047268 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:42:00.047236 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67580a9a-b1fe-4921-9484-c90f744ccc62-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "67580a9a-b1fe-4921-9484-c90f744ccc62" (UID: "67580a9a-b1fe-4921-9484-c90f744ccc62"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 17:42:00.047350 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:42:00.047320 2576 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/67580a9a-b1fe-4921-9484-c90f744ccc62-service-ca\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\""
Apr 16 17:42:00.047350 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:42:00.047336 2576 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/67580a9a-b1fe-4921-9484-c90f744ccc62-console-config\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\""
Apr 16 17:42:00.047417 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:42:00.047350 2576 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/67580a9a-b1fe-4921-9484-c90f744ccc62-oauth-serving-cert\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\""
Apr 16 17:42:00.049204 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:42:00.049166 2576
operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67580a9a-b1fe-4921-9484-c90f744ccc62-kube-api-access-lrfmt" (OuterVolumeSpecName: "kube-api-access-lrfmt") pod "67580a9a-b1fe-4921-9484-c90f744ccc62" (UID: "67580a9a-b1fe-4921-9484-c90f744ccc62"). InnerVolumeSpecName "kube-api-access-lrfmt". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 17:42:00.049297 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:42:00.049204 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67580a9a-b1fe-4921-9484-c90f744ccc62-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "67580a9a-b1fe-4921-9484-c90f744ccc62" (UID: "67580a9a-b1fe-4921-9484-c90f744ccc62"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 17:42:00.049297 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:42:00.049249 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67580a9a-b1fe-4921-9484-c90f744ccc62-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "67580a9a-b1fe-4921-9484-c90f744ccc62" (UID: "67580a9a-b1fe-4921-9484-c90f744ccc62"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 17:42:00.080783 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:42:00.080758 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-546f885c99-8d5wc_67580a9a-b1fe-4921-9484-c90f744ccc62/console/0.log" Apr 16 17:42:00.080964 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:42:00.080797 2576 generic.go:358] "Generic (PLEG): container finished" podID="67580a9a-b1fe-4921-9484-c90f744ccc62" containerID="d3b7574ce24bc6b49c08cee062a8b730a3c715807facf4b41d304cd620f77bd4" exitCode=2 Apr 16 17:42:00.080964 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:42:00.080827 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-546f885c99-8d5wc" event={"ID":"67580a9a-b1fe-4921-9484-c90f744ccc62","Type":"ContainerDied","Data":"d3b7574ce24bc6b49c08cee062a8b730a3c715807facf4b41d304cd620f77bd4"} Apr 16 17:42:00.080964 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:42:00.080865 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-546f885c99-8d5wc" Apr 16 17:42:00.080964 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:42:00.080881 2576 scope.go:117] "RemoveContainer" containerID="d3b7574ce24bc6b49c08cee062a8b730a3c715807facf4b41d304cd620f77bd4" Apr 16 17:42:00.081124 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:42:00.080871 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-546f885c99-8d5wc" event={"ID":"67580a9a-b1fe-4921-9484-c90f744ccc62","Type":"ContainerDied","Data":"3037168672db451ad5b1fc4c86bd3eca8de1f8b543ea650077774eed196b53bf"} Apr 16 17:42:00.088872 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:42:00.088837 2576 scope.go:117] "RemoveContainer" containerID="d3b7574ce24bc6b49c08cee062a8b730a3c715807facf4b41d304cd620f77bd4" Apr 16 17:42:00.089329 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:42:00.089301 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3b7574ce24bc6b49c08cee062a8b730a3c715807facf4b41d304cd620f77bd4\": container with ID starting with d3b7574ce24bc6b49c08cee062a8b730a3c715807facf4b41d304cd620f77bd4 not found: ID does not exist" containerID="d3b7574ce24bc6b49c08cee062a8b730a3c715807facf4b41d304cd620f77bd4" Apr 16 17:42:00.089415 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:42:00.089333 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3b7574ce24bc6b49c08cee062a8b730a3c715807facf4b41d304cd620f77bd4"} err="failed to get container status \"d3b7574ce24bc6b49c08cee062a8b730a3c715807facf4b41d304cd620f77bd4\": rpc error: code = NotFound desc = could not find container \"d3b7574ce24bc6b49c08cee062a8b730a3c715807facf4b41d304cd620f77bd4\": container with ID starting with d3b7574ce24bc6b49c08cee062a8b730a3c715807facf4b41d304cd620f77bd4 not found: ID does not exist" Apr 16 17:42:00.104350 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:42:00.104330 2576 kubelet.go:2553] 
"SyncLoop DELETE" source="api" pods=["openshift-console/console-546f885c99-8d5wc"] Apr 16 17:42:00.109317 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:42:00.109286 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-546f885c99-8d5wc"] Apr 16 17:42:00.147717 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:42:00.147688 2576 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/67580a9a-b1fe-4921-9484-c90f744ccc62-console-oauth-config\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\"" Apr 16 17:42:00.147717 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:42:00.147713 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lrfmt\" (UniqueName: \"kubernetes.io/projected/67580a9a-b1fe-4921-9484-c90f744ccc62-kube-api-access-lrfmt\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\"" Apr 16 17:42:00.147717 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:42:00.147725 2576 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/67580a9a-b1fe-4921-9484-c90f744ccc62-console-serving-cert\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\"" Apr 16 17:42:00.626146 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:42:00.626116 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67580a9a-b1fe-4921-9484-c90f744ccc62" path="/var/lib/kubelet/pods/67580a9a-b1fe-4921-9484-c90f744ccc62/volumes" Apr 16 17:42:37.180353 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:42:37.180327 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-6667474d89-gvbhc_1b5a2170-b541-4c6d-918a-c836b3286e61/cluster-monitoring-operator/0.log" Apr 16 17:42:37.180728 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:42:37.180368 2576 generic.go:358] "Generic (PLEG): container finished" podID="1b5a2170-b541-4c6d-918a-c836b3286e61" 
containerID="e8d4cb2129d9e53e092c73a307468e25be52286ffaed02cc6b301276c25ea844" exitCode=2 Apr 16 17:42:37.180728 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:42:37.180427 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-gvbhc" event={"ID":"1b5a2170-b541-4c6d-918a-c836b3286e61","Type":"ContainerDied","Data":"e8d4cb2129d9e53e092c73a307468e25be52286ffaed02cc6b301276c25ea844"} Apr 16 17:42:37.180799 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:42:37.180756 2576 scope.go:117] "RemoveContainer" containerID="e8d4cb2129d9e53e092c73a307468e25be52286ffaed02cc6b301276c25ea844" Apr 16 17:42:38.185116 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:42:38.185086 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-6667474d89-gvbhc_1b5a2170-b541-4c6d-918a-c836b3286e61/cluster-monitoring-operator/0.log" Apr 16 17:42:38.185506 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:42:38.185173 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-gvbhc" event={"ID":"1b5a2170-b541-4c6d-918a-c836b3286e61","Type":"ContainerStarted","Data":"66048615ff2315cae14e0d3669a3cc7f64e9871543116c9e0ecea5e6960070d2"} Apr 16 17:42:49.535750 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:42:49.535714 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-79449ffdc4-ts2k5"] Apr 16 17:42:49.536256 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:42:49.535989 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="67580a9a-b1fe-4921-9484-c90f744ccc62" containerName="console" Apr 16 17:42:49.536256 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:42:49.536001 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="67580a9a-b1fe-4921-9484-c90f744ccc62" containerName="console" Apr 16 17:42:49.536256 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:42:49.536014 2576 
cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a3c3daf4-9be0-4443-af47-0a67097365fa" containerName="registry" Apr 16 17:42:49.536256 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:42:49.536020 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3c3daf4-9be0-4443-af47-0a67097365fa" containerName="registry" Apr 16 17:42:49.536256 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:42:49.536070 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="a3c3daf4-9be0-4443-af47-0a67097365fa" containerName="registry" Apr 16 17:42:49.536256 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:42:49.536080 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="67580a9a-b1fe-4921-9484-c90f744ccc62" containerName="console" Apr 16 17:42:49.538990 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:42:49.538973 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-79449ffdc4-ts2k5" Apr 16 17:42:49.542307 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:42:49.542286 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 16 17:42:49.542873 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:42:49.542851 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 16 17:42:49.543448 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:42:49.543429 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 16 17:42:49.543597 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:42:49.543580 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 16 17:42:49.545071 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:42:49.545038 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-console\"/\"console-config\"" Apr 16 17:42:49.545169 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:42:49.545091 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-lwftg\"" Apr 16 17:42:49.552274 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:42:49.552251 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 16 17:42:49.553450 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:42:49.553422 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-79449ffdc4-ts2k5"] Apr 16 17:42:49.613534 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:42:49.613496 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/422b78ab-ae14-4534-93f5-b2d8b0678cbc-service-ca\") pod \"console-79449ffdc4-ts2k5\" (UID: \"422b78ab-ae14-4534-93f5-b2d8b0678cbc\") " pod="openshift-console/console-79449ffdc4-ts2k5" Apr 16 17:42:49.613534 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:42:49.613535 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/422b78ab-ae14-4534-93f5-b2d8b0678cbc-console-serving-cert\") pod \"console-79449ffdc4-ts2k5\" (UID: \"422b78ab-ae14-4534-93f5-b2d8b0678cbc\") " pod="openshift-console/console-79449ffdc4-ts2k5" Apr 16 17:42:49.613758 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:42:49.613570 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/422b78ab-ae14-4534-93f5-b2d8b0678cbc-trusted-ca-bundle\") pod \"console-79449ffdc4-ts2k5\" (UID: \"422b78ab-ae14-4534-93f5-b2d8b0678cbc\") " pod="openshift-console/console-79449ffdc4-ts2k5" Apr 16 17:42:49.613758 ip-10-0-141-32 
kubenswrapper[2576]: I0416 17:42:49.613600 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/422b78ab-ae14-4534-93f5-b2d8b0678cbc-oauth-serving-cert\") pod \"console-79449ffdc4-ts2k5\" (UID: \"422b78ab-ae14-4534-93f5-b2d8b0678cbc\") " pod="openshift-console/console-79449ffdc4-ts2k5" Apr 16 17:42:49.613758 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:42:49.613665 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgjd6\" (UniqueName: \"kubernetes.io/projected/422b78ab-ae14-4534-93f5-b2d8b0678cbc-kube-api-access-mgjd6\") pod \"console-79449ffdc4-ts2k5\" (UID: \"422b78ab-ae14-4534-93f5-b2d8b0678cbc\") " pod="openshift-console/console-79449ffdc4-ts2k5" Apr 16 17:42:49.613758 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:42:49.613708 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/422b78ab-ae14-4534-93f5-b2d8b0678cbc-console-oauth-config\") pod \"console-79449ffdc4-ts2k5\" (UID: \"422b78ab-ae14-4534-93f5-b2d8b0678cbc\") " pod="openshift-console/console-79449ffdc4-ts2k5" Apr 16 17:42:49.613758 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:42:49.613730 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/422b78ab-ae14-4534-93f5-b2d8b0678cbc-console-config\") pod \"console-79449ffdc4-ts2k5\" (UID: \"422b78ab-ae14-4534-93f5-b2d8b0678cbc\") " pod="openshift-console/console-79449ffdc4-ts2k5" Apr 16 17:42:49.714206 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:42:49.714165 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/422b78ab-ae14-4534-93f5-b2d8b0678cbc-trusted-ca-bundle\") pod 
\"console-79449ffdc4-ts2k5\" (UID: \"422b78ab-ae14-4534-93f5-b2d8b0678cbc\") " pod="openshift-console/console-79449ffdc4-ts2k5" Apr 16 17:42:49.714206 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:42:49.714206 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/422b78ab-ae14-4534-93f5-b2d8b0678cbc-oauth-serving-cert\") pod \"console-79449ffdc4-ts2k5\" (UID: \"422b78ab-ae14-4534-93f5-b2d8b0678cbc\") " pod="openshift-console/console-79449ffdc4-ts2k5" Apr 16 17:42:49.714419 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:42:49.714232 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mgjd6\" (UniqueName: \"kubernetes.io/projected/422b78ab-ae14-4534-93f5-b2d8b0678cbc-kube-api-access-mgjd6\") pod \"console-79449ffdc4-ts2k5\" (UID: \"422b78ab-ae14-4534-93f5-b2d8b0678cbc\") " pod="openshift-console/console-79449ffdc4-ts2k5" Apr 16 17:42:49.714419 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:42:49.714261 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/422b78ab-ae14-4534-93f5-b2d8b0678cbc-console-oauth-config\") pod \"console-79449ffdc4-ts2k5\" (UID: \"422b78ab-ae14-4534-93f5-b2d8b0678cbc\") " pod="openshift-console/console-79449ffdc4-ts2k5" Apr 16 17:42:49.714419 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:42:49.714281 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/422b78ab-ae14-4534-93f5-b2d8b0678cbc-console-config\") pod \"console-79449ffdc4-ts2k5\" (UID: \"422b78ab-ae14-4534-93f5-b2d8b0678cbc\") " pod="openshift-console/console-79449ffdc4-ts2k5" Apr 16 17:42:49.714419 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:42:49.714311 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/422b78ab-ae14-4534-93f5-b2d8b0678cbc-service-ca\") pod \"console-79449ffdc4-ts2k5\" (UID: \"422b78ab-ae14-4534-93f5-b2d8b0678cbc\") " pod="openshift-console/console-79449ffdc4-ts2k5" Apr 16 17:42:49.714419 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:42:49.714346 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/422b78ab-ae14-4534-93f5-b2d8b0678cbc-console-serving-cert\") pod \"console-79449ffdc4-ts2k5\" (UID: \"422b78ab-ae14-4534-93f5-b2d8b0678cbc\") " pod="openshift-console/console-79449ffdc4-ts2k5" Apr 16 17:42:49.715020 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:42:49.714990 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/422b78ab-ae14-4534-93f5-b2d8b0678cbc-oauth-serving-cert\") pod \"console-79449ffdc4-ts2k5\" (UID: \"422b78ab-ae14-4534-93f5-b2d8b0678cbc\") " pod="openshift-console/console-79449ffdc4-ts2k5" Apr 16 17:42:49.715132 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:42:49.715054 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/422b78ab-ae14-4534-93f5-b2d8b0678cbc-console-config\") pod \"console-79449ffdc4-ts2k5\" (UID: \"422b78ab-ae14-4534-93f5-b2d8b0678cbc\") " pod="openshift-console/console-79449ffdc4-ts2k5" Apr 16 17:42:49.715132 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:42:49.715107 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/422b78ab-ae14-4534-93f5-b2d8b0678cbc-service-ca\") pod \"console-79449ffdc4-ts2k5\" (UID: \"422b78ab-ae14-4534-93f5-b2d8b0678cbc\") " pod="openshift-console/console-79449ffdc4-ts2k5" Apr 16 17:42:49.715234 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:42:49.715219 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/422b78ab-ae14-4534-93f5-b2d8b0678cbc-trusted-ca-bundle\") pod \"console-79449ffdc4-ts2k5\" (UID: \"422b78ab-ae14-4534-93f5-b2d8b0678cbc\") " pod="openshift-console/console-79449ffdc4-ts2k5" Apr 16 17:42:49.716793 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:42:49.716765 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/422b78ab-ae14-4534-93f5-b2d8b0678cbc-console-oauth-config\") pod \"console-79449ffdc4-ts2k5\" (UID: \"422b78ab-ae14-4534-93f5-b2d8b0678cbc\") " pod="openshift-console/console-79449ffdc4-ts2k5" Apr 16 17:42:49.716916 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:42:49.716886 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/422b78ab-ae14-4534-93f5-b2d8b0678cbc-console-serving-cert\") pod \"console-79449ffdc4-ts2k5\" (UID: \"422b78ab-ae14-4534-93f5-b2d8b0678cbc\") " pod="openshift-console/console-79449ffdc4-ts2k5" Apr 16 17:42:49.724469 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:42:49.724447 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgjd6\" (UniqueName: \"kubernetes.io/projected/422b78ab-ae14-4534-93f5-b2d8b0678cbc-kube-api-access-mgjd6\") pod \"console-79449ffdc4-ts2k5\" (UID: \"422b78ab-ae14-4534-93f5-b2d8b0678cbc\") " pod="openshift-console/console-79449ffdc4-ts2k5" Apr 16 17:42:49.848383 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:42:49.848291 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-79449ffdc4-ts2k5" Apr 16 17:42:49.974303 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:42:49.974267 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-79449ffdc4-ts2k5"] Apr 16 17:42:49.976275 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:42:49.976248 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod422b78ab_ae14_4534_93f5_b2d8b0678cbc.slice/crio-888b6944a25cf6db9ab09b960a073884b93becd2505ff462c822bf8584af84ec WatchSource:0}: Error finding container 888b6944a25cf6db9ab09b960a073884b93becd2505ff462c822bf8584af84ec: Status 404 returned error can't find the container with id 888b6944a25cf6db9ab09b960a073884b93becd2505ff462c822bf8584af84ec Apr 16 17:42:50.217251 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:42:50.217211 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-79449ffdc4-ts2k5" event={"ID":"422b78ab-ae14-4534-93f5-b2d8b0678cbc","Type":"ContainerStarted","Data":"321ae1405209f05ee94b6971c6c2f44b5f4cd162c8b78eb150d5a527d877136b"} Apr 16 17:42:50.217251 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:42:50.217253 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-79449ffdc4-ts2k5" event={"ID":"422b78ab-ae14-4534-93f5-b2d8b0678cbc","Type":"ContainerStarted","Data":"888b6944a25cf6db9ab09b960a073884b93becd2505ff462c822bf8584af84ec"} Apr 16 17:42:50.242611 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:42:50.242551 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-79449ffdc4-ts2k5" podStartSLOduration=1.24253421 podStartE2EDuration="1.24253421s" podCreationTimestamp="2026-04-16 17:42:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 17:42:50.24137753 +0000 UTC m=+168.257845043" 
watchObservedRunningTime="2026-04-16 17:42:50.24253421 +0000 UTC m=+168.259001738" Apr 16 17:42:57.045412 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:42:57.045325 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-79449ffdc4-ts2k5"] Apr 16 17:42:57.094863 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:42:57.092709 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7c75c85d46-ws7st"] Apr 16 17:42:57.097105 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:42:57.097085 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7c75c85d46-ws7st" Apr 16 17:42:57.106153 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:42:57.106127 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7c75c85d46-ws7st"] Apr 16 17:42:57.270287 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:42:57.270246 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/125afcbc-3390-4acd-81fb-07bc14feec8c-console-serving-cert\") pod \"console-7c75c85d46-ws7st\" (UID: \"125afcbc-3390-4acd-81fb-07bc14feec8c\") " pod="openshift-console/console-7c75c85d46-ws7st" Apr 16 17:42:57.270287 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:42:57.270287 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/125afcbc-3390-4acd-81fb-07bc14feec8c-oauth-serving-cert\") pod \"console-7c75c85d46-ws7st\" (UID: \"125afcbc-3390-4acd-81fb-07bc14feec8c\") " pod="openshift-console/console-7c75c85d46-ws7st" Apr 16 17:42:57.270503 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:42:57.270314 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55gsq\" (UniqueName: 
\"kubernetes.io/projected/125afcbc-3390-4acd-81fb-07bc14feec8c-kube-api-access-55gsq\") pod \"console-7c75c85d46-ws7st\" (UID: \"125afcbc-3390-4acd-81fb-07bc14feec8c\") " pod="openshift-console/console-7c75c85d46-ws7st" Apr 16 17:42:57.270503 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:42:57.270369 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/125afcbc-3390-4acd-81fb-07bc14feec8c-console-config\") pod \"console-7c75c85d46-ws7st\" (UID: \"125afcbc-3390-4acd-81fb-07bc14feec8c\") " pod="openshift-console/console-7c75c85d46-ws7st" Apr 16 17:42:57.270503 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:42:57.270452 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/125afcbc-3390-4acd-81fb-07bc14feec8c-service-ca\") pod \"console-7c75c85d46-ws7st\" (UID: \"125afcbc-3390-4acd-81fb-07bc14feec8c\") " pod="openshift-console/console-7c75c85d46-ws7st" Apr 16 17:42:57.270618 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:42:57.270508 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/125afcbc-3390-4acd-81fb-07bc14feec8c-trusted-ca-bundle\") pod \"console-7c75c85d46-ws7st\" (UID: \"125afcbc-3390-4acd-81fb-07bc14feec8c\") " pod="openshift-console/console-7c75c85d46-ws7st" Apr 16 17:42:57.270618 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:42:57.270530 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/125afcbc-3390-4acd-81fb-07bc14feec8c-console-oauth-config\") pod \"console-7c75c85d46-ws7st\" (UID: \"125afcbc-3390-4acd-81fb-07bc14feec8c\") " pod="openshift-console/console-7c75c85d46-ws7st" Apr 16 17:42:57.371401 ip-10-0-141-32 kubenswrapper[2576]: I0416 
17:42:57.371310 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/125afcbc-3390-4acd-81fb-07bc14feec8c-service-ca\") pod \"console-7c75c85d46-ws7st\" (UID: \"125afcbc-3390-4acd-81fb-07bc14feec8c\") " pod="openshift-console/console-7c75c85d46-ws7st" Apr 16 17:42:57.371401 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:42:57.371361 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/125afcbc-3390-4acd-81fb-07bc14feec8c-trusted-ca-bundle\") pod \"console-7c75c85d46-ws7st\" (UID: \"125afcbc-3390-4acd-81fb-07bc14feec8c\") " pod="openshift-console/console-7c75c85d46-ws7st" Apr 16 17:42:57.371401 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:42:57.371379 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/125afcbc-3390-4acd-81fb-07bc14feec8c-console-oauth-config\") pod \"console-7c75c85d46-ws7st\" (UID: \"125afcbc-3390-4acd-81fb-07bc14feec8c\") " pod="openshift-console/console-7c75c85d46-ws7st" Apr 16 17:42:57.371669 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:42:57.371409 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/125afcbc-3390-4acd-81fb-07bc14feec8c-console-serving-cert\") pod \"console-7c75c85d46-ws7st\" (UID: \"125afcbc-3390-4acd-81fb-07bc14feec8c\") " pod="openshift-console/console-7c75c85d46-ws7st" Apr 16 17:42:57.371669 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:42:57.371426 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/125afcbc-3390-4acd-81fb-07bc14feec8c-oauth-serving-cert\") pod \"console-7c75c85d46-ws7st\" (UID: \"125afcbc-3390-4acd-81fb-07bc14feec8c\") " pod="openshift-console/console-7c75c85d46-ws7st" Apr 
16 17:42:57.371669 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:42:57.371444 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-55gsq\" (UniqueName: \"kubernetes.io/projected/125afcbc-3390-4acd-81fb-07bc14feec8c-kube-api-access-55gsq\") pod \"console-7c75c85d46-ws7st\" (UID: \"125afcbc-3390-4acd-81fb-07bc14feec8c\") " pod="openshift-console/console-7c75c85d46-ws7st" Apr 16 17:42:57.371669 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:42:57.371467 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/125afcbc-3390-4acd-81fb-07bc14feec8c-console-config\") pod \"console-7c75c85d46-ws7st\" (UID: \"125afcbc-3390-4acd-81fb-07bc14feec8c\") " pod="openshift-console/console-7c75c85d46-ws7st" Apr 16 17:42:57.372248 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:42:57.372220 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/125afcbc-3390-4acd-81fb-07bc14feec8c-service-ca\") pod \"console-7c75c85d46-ws7st\" (UID: \"125afcbc-3390-4acd-81fb-07bc14feec8c\") " pod="openshift-console/console-7c75c85d46-ws7st" Apr 16 17:42:57.372248 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:42:57.372237 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/125afcbc-3390-4acd-81fb-07bc14feec8c-oauth-serving-cert\") pod \"console-7c75c85d46-ws7st\" (UID: \"125afcbc-3390-4acd-81fb-07bc14feec8c\") " pod="openshift-console/console-7c75c85d46-ws7st" Apr 16 17:42:57.372393 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:42:57.372312 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/125afcbc-3390-4acd-81fb-07bc14feec8c-console-config\") pod \"console-7c75c85d46-ws7st\" (UID: \"125afcbc-3390-4acd-81fb-07bc14feec8c\") " 
pod="openshift-console/console-7c75c85d46-ws7st" Apr 16 17:42:57.372393 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:42:57.372375 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/125afcbc-3390-4acd-81fb-07bc14feec8c-trusted-ca-bundle\") pod \"console-7c75c85d46-ws7st\" (UID: \"125afcbc-3390-4acd-81fb-07bc14feec8c\") " pod="openshift-console/console-7c75c85d46-ws7st" Apr 16 17:42:57.373862 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:42:57.373838 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/125afcbc-3390-4acd-81fb-07bc14feec8c-console-oauth-config\") pod \"console-7c75c85d46-ws7st\" (UID: \"125afcbc-3390-4acd-81fb-07bc14feec8c\") " pod="openshift-console/console-7c75c85d46-ws7st" Apr 16 17:42:57.373979 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:42:57.373959 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/125afcbc-3390-4acd-81fb-07bc14feec8c-console-serving-cert\") pod \"console-7c75c85d46-ws7st\" (UID: \"125afcbc-3390-4acd-81fb-07bc14feec8c\") " pod="openshift-console/console-7c75c85d46-ws7st" Apr 16 17:42:57.385525 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:42:57.385498 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-55gsq\" (UniqueName: \"kubernetes.io/projected/125afcbc-3390-4acd-81fb-07bc14feec8c-kube-api-access-55gsq\") pod \"console-7c75c85d46-ws7st\" (UID: \"125afcbc-3390-4acd-81fb-07bc14feec8c\") " pod="openshift-console/console-7c75c85d46-ws7st" Apr 16 17:42:57.407042 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:42:57.407013 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7c75c85d46-ws7st" Apr 16 17:42:57.533326 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:42:57.533300 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7c75c85d46-ws7st"] Apr 16 17:42:57.535834 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:42:57.535807 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod125afcbc_3390_4acd_81fb_07bc14feec8c.slice/crio-f2bfc0231ea16ad39a88f674d95de14358e6085d29636c0aaa11019e02572f4d WatchSource:0}: Error finding container f2bfc0231ea16ad39a88f674d95de14358e6085d29636c0aaa11019e02572f4d: Status 404 returned error can't find the container with id f2bfc0231ea16ad39a88f674d95de14358e6085d29636c0aaa11019e02572f4d Apr 16 17:42:58.240357 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:42:58.240320 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7c75c85d46-ws7st" event={"ID":"125afcbc-3390-4acd-81fb-07bc14feec8c","Type":"ContainerStarted","Data":"4e0b0d0369ff4a047d1c9b163722316c3c42a33159984b06ee22c4b6b1e1f22f"} Apr 16 17:42:58.240728 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:42:58.240367 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7c75c85d46-ws7st" event={"ID":"125afcbc-3390-4acd-81fb-07bc14feec8c","Type":"ContainerStarted","Data":"f2bfc0231ea16ad39a88f674d95de14358e6085d29636c0aaa11019e02572f4d"} Apr 16 17:42:58.262750 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:42:58.262699 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7c75c85d46-ws7st" podStartSLOduration=1.262684464 podStartE2EDuration="1.262684464s" podCreationTimestamp="2026-04-16 17:42:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 17:42:58.260397744 +0000 UTC m=+176.276865265" 
watchObservedRunningTime="2026-04-16 17:42:58.262684464 +0000 UTC m=+176.279151996" Apr 16 17:42:59.848686 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:42:59.848645 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-79449ffdc4-ts2k5" Apr 16 17:43:07.408114 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:43:07.408059 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7c75c85d46-ws7st" Apr 16 17:43:07.408114 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:43:07.408112 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7c75c85d46-ws7st" Apr 16 17:43:07.412888 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:43:07.412860 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7c75c85d46-ws7st" Apr 16 17:43:08.271129 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:43:08.271103 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7c75c85d46-ws7st" Apr 16 17:43:22.063744 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:43:22.063675 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-79449ffdc4-ts2k5" podUID="422b78ab-ae14-4534-93f5-b2d8b0678cbc" containerName="console" containerID="cri-o://321ae1405209f05ee94b6971c6c2f44b5f4cd162c8b78eb150d5a527d877136b" gracePeriod=15 Apr 16 17:43:22.305324 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:43:22.305303 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-79449ffdc4-ts2k5_422b78ab-ae14-4534-93f5-b2d8b0678cbc/console/0.log" Apr 16 17:43:22.305477 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:43:22.305374 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-79449ffdc4-ts2k5" Apr 16 17:43:22.306080 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:43:22.306062 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-79449ffdc4-ts2k5_422b78ab-ae14-4534-93f5-b2d8b0678cbc/console/0.log" Apr 16 17:43:22.306152 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:43:22.306099 2576 generic.go:358] "Generic (PLEG): container finished" podID="422b78ab-ae14-4534-93f5-b2d8b0678cbc" containerID="321ae1405209f05ee94b6971c6c2f44b5f4cd162c8b78eb150d5a527d877136b" exitCode=2 Apr 16 17:43:22.306186 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:43:22.306154 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-79449ffdc4-ts2k5" event={"ID":"422b78ab-ae14-4534-93f5-b2d8b0678cbc","Type":"ContainerDied","Data":"321ae1405209f05ee94b6971c6c2f44b5f4cd162c8b78eb150d5a527d877136b"} Apr 16 17:43:22.306219 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:43:22.306187 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-79449ffdc4-ts2k5" event={"ID":"422b78ab-ae14-4534-93f5-b2d8b0678cbc","Type":"ContainerDied","Data":"888b6944a25cf6db9ab09b960a073884b93becd2505ff462c822bf8584af84ec"} Apr 16 17:43:22.306219 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:43:22.306204 2576 scope.go:117] "RemoveContainer" containerID="321ae1405209f05ee94b6971c6c2f44b5f4cd162c8b78eb150d5a527d877136b" Apr 16 17:43:22.313237 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:43:22.313219 2576 scope.go:117] "RemoveContainer" containerID="321ae1405209f05ee94b6971c6c2f44b5f4cd162c8b78eb150d5a527d877136b" Apr 16 17:43:22.313494 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:43:22.313475 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"321ae1405209f05ee94b6971c6c2f44b5f4cd162c8b78eb150d5a527d877136b\": container with ID starting with 
321ae1405209f05ee94b6971c6c2f44b5f4cd162c8b78eb150d5a527d877136b not found: ID does not exist" containerID="321ae1405209f05ee94b6971c6c2f44b5f4cd162c8b78eb150d5a527d877136b" Apr 16 17:43:22.313561 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:43:22.313501 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"321ae1405209f05ee94b6971c6c2f44b5f4cd162c8b78eb150d5a527d877136b"} err="failed to get container status \"321ae1405209f05ee94b6971c6c2f44b5f4cd162c8b78eb150d5a527d877136b\": rpc error: code = NotFound desc = could not find container \"321ae1405209f05ee94b6971c6c2f44b5f4cd162c8b78eb150d5a527d877136b\": container with ID starting with 321ae1405209f05ee94b6971c6c2f44b5f4cd162c8b78eb150d5a527d877136b not found: ID does not exist" Apr 16 17:43:22.366350 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:43:22.366245 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/422b78ab-ae14-4534-93f5-b2d8b0678cbc-oauth-serving-cert\") pod \"422b78ab-ae14-4534-93f5-b2d8b0678cbc\" (UID: \"422b78ab-ae14-4534-93f5-b2d8b0678cbc\") " Apr 16 17:43:22.366350 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:43:22.366301 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/422b78ab-ae14-4534-93f5-b2d8b0678cbc-console-config\") pod \"422b78ab-ae14-4534-93f5-b2d8b0678cbc\" (UID: \"422b78ab-ae14-4534-93f5-b2d8b0678cbc\") " Apr 16 17:43:22.366350 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:43:22.366330 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/422b78ab-ae14-4534-93f5-b2d8b0678cbc-console-oauth-config\") pod \"422b78ab-ae14-4534-93f5-b2d8b0678cbc\" (UID: \"422b78ab-ae14-4534-93f5-b2d8b0678cbc\") " Apr 16 17:43:22.366626 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:43:22.366356 
2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/422b78ab-ae14-4534-93f5-b2d8b0678cbc-trusted-ca-bundle\") pod \"422b78ab-ae14-4534-93f5-b2d8b0678cbc\" (UID: \"422b78ab-ae14-4534-93f5-b2d8b0678cbc\") " Apr 16 17:43:22.366626 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:43:22.366383 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/422b78ab-ae14-4534-93f5-b2d8b0678cbc-console-serving-cert\") pod \"422b78ab-ae14-4534-93f5-b2d8b0678cbc\" (UID: \"422b78ab-ae14-4534-93f5-b2d8b0678cbc\") " Apr 16 17:43:22.366626 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:43:22.366409 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mgjd6\" (UniqueName: \"kubernetes.io/projected/422b78ab-ae14-4534-93f5-b2d8b0678cbc-kube-api-access-mgjd6\") pod \"422b78ab-ae14-4534-93f5-b2d8b0678cbc\" (UID: \"422b78ab-ae14-4534-93f5-b2d8b0678cbc\") " Apr 16 17:43:22.366626 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:43:22.366461 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/422b78ab-ae14-4534-93f5-b2d8b0678cbc-service-ca\") pod \"422b78ab-ae14-4534-93f5-b2d8b0678cbc\" (UID: \"422b78ab-ae14-4534-93f5-b2d8b0678cbc\") " Apr 16 17:43:22.366782 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:43:22.366748 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/422b78ab-ae14-4534-93f5-b2d8b0678cbc-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "422b78ab-ae14-4534-93f5-b2d8b0678cbc" (UID: "422b78ab-ae14-4534-93f5-b2d8b0678cbc"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 17:43:22.366937 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:43:22.366852 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/422b78ab-ae14-4534-93f5-b2d8b0678cbc-console-config" (OuterVolumeSpecName: "console-config") pod "422b78ab-ae14-4534-93f5-b2d8b0678cbc" (UID: "422b78ab-ae14-4534-93f5-b2d8b0678cbc"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 17:43:22.367013 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:43:22.366951 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/422b78ab-ae14-4534-93f5-b2d8b0678cbc-service-ca" (OuterVolumeSpecName: "service-ca") pod "422b78ab-ae14-4534-93f5-b2d8b0678cbc" (UID: "422b78ab-ae14-4534-93f5-b2d8b0678cbc"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 17:43:22.367052 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:43:22.367023 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/422b78ab-ae14-4534-93f5-b2d8b0678cbc-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "422b78ab-ae14-4534-93f5-b2d8b0678cbc" (UID: "422b78ab-ae14-4534-93f5-b2d8b0678cbc"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 17:43:22.368621 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:43:22.368598 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/422b78ab-ae14-4534-93f5-b2d8b0678cbc-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "422b78ab-ae14-4534-93f5-b2d8b0678cbc" (UID: "422b78ab-ae14-4534-93f5-b2d8b0678cbc"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 17:43:22.368723 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:43:22.368623 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/422b78ab-ae14-4534-93f5-b2d8b0678cbc-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "422b78ab-ae14-4534-93f5-b2d8b0678cbc" (UID: "422b78ab-ae14-4534-93f5-b2d8b0678cbc"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 17:43:22.368723 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:43:22.368666 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/422b78ab-ae14-4534-93f5-b2d8b0678cbc-kube-api-access-mgjd6" (OuterVolumeSpecName: "kube-api-access-mgjd6") pod "422b78ab-ae14-4534-93f5-b2d8b0678cbc" (UID: "422b78ab-ae14-4534-93f5-b2d8b0678cbc"). InnerVolumeSpecName "kube-api-access-mgjd6". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 17:43:22.467765 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:43:22.467727 2576 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/422b78ab-ae14-4534-93f5-b2d8b0678cbc-oauth-serving-cert\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\"" Apr 16 17:43:22.467765 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:43:22.467759 2576 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/422b78ab-ae14-4534-93f5-b2d8b0678cbc-console-config\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\"" Apr 16 17:43:22.467765 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:43:22.467770 2576 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/422b78ab-ae14-4534-93f5-b2d8b0678cbc-console-oauth-config\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\"" Apr 16 17:43:22.468026 ip-10-0-141-32 
kubenswrapper[2576]: I0416 17:43:22.467780 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/422b78ab-ae14-4534-93f5-b2d8b0678cbc-trusted-ca-bundle\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\"" Apr 16 17:43:22.468026 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:43:22.467789 2576 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/422b78ab-ae14-4534-93f5-b2d8b0678cbc-console-serving-cert\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\"" Apr 16 17:43:22.468026 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:43:22.467798 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mgjd6\" (UniqueName: \"kubernetes.io/projected/422b78ab-ae14-4534-93f5-b2d8b0678cbc-kube-api-access-mgjd6\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\"" Apr 16 17:43:22.468026 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:43:22.467808 2576 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/422b78ab-ae14-4534-93f5-b2d8b0678cbc-service-ca\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\"" Apr 16 17:43:23.309524 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:43:23.309497 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-79449ffdc4-ts2k5" Apr 16 17:43:23.331963 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:43:23.331930 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-79449ffdc4-ts2k5"] Apr 16 17:43:23.336864 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:43:23.336835 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-79449ffdc4-ts2k5"] Apr 16 17:43:24.626388 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:43:24.626350 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="422b78ab-ae14-4534-93f5-b2d8b0678cbc" path="/var/lib/kubelet/pods/422b78ab-ae14-4534-93f5-b2d8b0678cbc/volumes" Apr 16 17:45:02.490357 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:45:02.490324 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-6667474d89-gvbhc_1b5a2170-b541-4c6d-918a-c836b3286e61/cluster-monitoring-operator/0.log" Apr 16 17:45:02.490893 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:45:02.490324 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-6667474d89-gvbhc_1b5a2170-b541-4c6d-918a-c836b3286e61/cluster-monitoring-operator/0.log" Apr 16 17:45:02.500026 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:45:02.500003 2576 kubelet.go:1628] "Image garbage collection succeeded" Apr 16 17:47:57.678652 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:47:57.678570 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6d867bf5b5-fp8rs"] Apr 16 17:47:57.679125 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:47:57.678845 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="422b78ab-ae14-4534-93f5-b2d8b0678cbc" containerName="console" Apr 16 17:47:57.679125 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:47:57.678858 2576 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="422b78ab-ae14-4534-93f5-b2d8b0678cbc" containerName="console" Apr 16 17:47:57.679125 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:47:57.678929 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="422b78ab-ae14-4534-93f5-b2d8b0678cbc" containerName="console" Apr 16 17:47:57.681660 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:47:57.681642 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6d867bf5b5-fp8rs" Apr 16 17:47:57.703133 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:47:57.703102 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6d867bf5b5-fp8rs"] Apr 16 17:47:57.811425 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:47:57.811379 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1d6887bc-23dd-449c-aecc-8adedad8c860-console-oauth-config\") pod \"console-6d867bf5b5-fp8rs\" (UID: \"1d6887bc-23dd-449c-aecc-8adedad8c860\") " pod="openshift-console/console-6d867bf5b5-fp8rs" Apr 16 17:47:57.811425 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:47:57.811422 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1d6887bc-23dd-449c-aecc-8adedad8c860-service-ca\") pod \"console-6d867bf5b5-fp8rs\" (UID: \"1d6887bc-23dd-449c-aecc-8adedad8c860\") " pod="openshift-console/console-6d867bf5b5-fp8rs" Apr 16 17:47:57.811634 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:47:57.811443 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1d6887bc-23dd-449c-aecc-8adedad8c860-console-config\") pod \"console-6d867bf5b5-fp8rs\" (UID: \"1d6887bc-23dd-449c-aecc-8adedad8c860\") " pod="openshift-console/console-6d867bf5b5-fp8rs" Apr 16 17:47:57.811634 ip-10-0-141-32 
kubenswrapper[2576]: I0416 17:47:57.811519 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1d6887bc-23dd-449c-aecc-8adedad8c860-console-serving-cert\") pod \"console-6d867bf5b5-fp8rs\" (UID: \"1d6887bc-23dd-449c-aecc-8adedad8c860\") " pod="openshift-console/console-6d867bf5b5-fp8rs" Apr 16 17:47:57.811634 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:47:57.811547 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1d6887bc-23dd-449c-aecc-8adedad8c860-trusted-ca-bundle\") pod \"console-6d867bf5b5-fp8rs\" (UID: \"1d6887bc-23dd-449c-aecc-8adedad8c860\") " pod="openshift-console/console-6d867bf5b5-fp8rs" Apr 16 17:47:57.811634 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:47:57.811569 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwvfh\" (UniqueName: \"kubernetes.io/projected/1d6887bc-23dd-449c-aecc-8adedad8c860-kube-api-access-bwvfh\") pod \"console-6d867bf5b5-fp8rs\" (UID: \"1d6887bc-23dd-449c-aecc-8adedad8c860\") " pod="openshift-console/console-6d867bf5b5-fp8rs" Apr 16 17:47:57.811758 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:47:57.811636 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1d6887bc-23dd-449c-aecc-8adedad8c860-oauth-serving-cert\") pod \"console-6d867bf5b5-fp8rs\" (UID: \"1d6887bc-23dd-449c-aecc-8adedad8c860\") " pod="openshift-console/console-6d867bf5b5-fp8rs" Apr 16 17:47:57.912791 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:47:57.912756 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1d6887bc-23dd-449c-aecc-8adedad8c860-oauth-serving-cert\") pod 
\"console-6d867bf5b5-fp8rs\" (UID: \"1d6887bc-23dd-449c-aecc-8adedad8c860\") " pod="openshift-console/console-6d867bf5b5-fp8rs" Apr 16 17:47:57.912892 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:47:57.912809 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1d6887bc-23dd-449c-aecc-8adedad8c860-console-oauth-config\") pod \"console-6d867bf5b5-fp8rs\" (UID: \"1d6887bc-23dd-449c-aecc-8adedad8c860\") " pod="openshift-console/console-6d867bf5b5-fp8rs" Apr 16 17:47:57.912892 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:47:57.912826 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1d6887bc-23dd-449c-aecc-8adedad8c860-service-ca\") pod \"console-6d867bf5b5-fp8rs\" (UID: \"1d6887bc-23dd-449c-aecc-8adedad8c860\") " pod="openshift-console/console-6d867bf5b5-fp8rs" Apr 16 17:47:57.913002 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:47:57.912953 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1d6887bc-23dd-449c-aecc-8adedad8c860-console-config\") pod \"console-6d867bf5b5-fp8rs\" (UID: \"1d6887bc-23dd-449c-aecc-8adedad8c860\") " pod="openshift-console/console-6d867bf5b5-fp8rs" Apr 16 17:47:57.913054 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:47:57.913017 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1d6887bc-23dd-449c-aecc-8adedad8c860-console-serving-cert\") pod \"console-6d867bf5b5-fp8rs\" (UID: \"1d6887bc-23dd-449c-aecc-8adedad8c860\") " pod="openshift-console/console-6d867bf5b5-fp8rs" Apr 16 17:47:57.913054 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:47:57.913043 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/1d6887bc-23dd-449c-aecc-8adedad8c860-trusted-ca-bundle\") pod \"console-6d867bf5b5-fp8rs\" (UID: \"1d6887bc-23dd-449c-aecc-8adedad8c860\") " pod="openshift-console/console-6d867bf5b5-fp8rs" Apr 16 17:47:57.913150 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:47:57.913073 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bwvfh\" (UniqueName: \"kubernetes.io/projected/1d6887bc-23dd-449c-aecc-8adedad8c860-kube-api-access-bwvfh\") pod \"console-6d867bf5b5-fp8rs\" (UID: \"1d6887bc-23dd-449c-aecc-8adedad8c860\") " pod="openshift-console/console-6d867bf5b5-fp8rs" Apr 16 17:47:57.913584 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:47:57.913551 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1d6887bc-23dd-449c-aecc-8adedad8c860-oauth-serving-cert\") pod \"console-6d867bf5b5-fp8rs\" (UID: \"1d6887bc-23dd-449c-aecc-8adedad8c860\") " pod="openshift-console/console-6d867bf5b5-fp8rs" Apr 16 17:47:57.913584 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:47:57.913554 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1d6887bc-23dd-449c-aecc-8adedad8c860-service-ca\") pod \"console-6d867bf5b5-fp8rs\" (UID: \"1d6887bc-23dd-449c-aecc-8adedad8c860\") " pod="openshift-console/console-6d867bf5b5-fp8rs" Apr 16 17:47:57.913736 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:47:57.913649 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1d6887bc-23dd-449c-aecc-8adedad8c860-console-config\") pod \"console-6d867bf5b5-fp8rs\" (UID: \"1d6887bc-23dd-449c-aecc-8adedad8c860\") " pod="openshift-console/console-6d867bf5b5-fp8rs" Apr 16 17:47:57.914009 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:47:57.913985 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1d6887bc-23dd-449c-aecc-8adedad8c860-trusted-ca-bundle\") pod \"console-6d867bf5b5-fp8rs\" (UID: \"1d6887bc-23dd-449c-aecc-8adedad8c860\") " pod="openshift-console/console-6d867bf5b5-fp8rs" Apr 16 17:47:57.915314 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:47:57.915296 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1d6887bc-23dd-449c-aecc-8adedad8c860-console-oauth-config\") pod \"console-6d867bf5b5-fp8rs\" (UID: \"1d6887bc-23dd-449c-aecc-8adedad8c860\") " pod="openshift-console/console-6d867bf5b5-fp8rs" Apr 16 17:47:57.915550 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:47:57.915533 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1d6887bc-23dd-449c-aecc-8adedad8c860-console-serving-cert\") pod \"console-6d867bf5b5-fp8rs\" (UID: \"1d6887bc-23dd-449c-aecc-8adedad8c860\") " pod="openshift-console/console-6d867bf5b5-fp8rs" Apr 16 17:47:57.923476 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:47:57.923445 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwvfh\" (UniqueName: \"kubernetes.io/projected/1d6887bc-23dd-449c-aecc-8adedad8c860-kube-api-access-bwvfh\") pod \"console-6d867bf5b5-fp8rs\" (UID: \"1d6887bc-23dd-449c-aecc-8adedad8c860\") " pod="openshift-console/console-6d867bf5b5-fp8rs" Apr 16 17:47:57.990478 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:47:57.990385 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6d867bf5b5-fp8rs" Apr 16 17:47:58.123689 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:47:58.123656 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6d867bf5b5-fp8rs"] Apr 16 17:47:58.126702 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:47:58.126675 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d6887bc_23dd_449c_aecc_8adedad8c860.slice/crio-eed32fa900494c188721fa6399bf608a6ba14064764bda0e090bd521915a09d2 WatchSource:0}: Error finding container eed32fa900494c188721fa6399bf608a6ba14064764bda0e090bd521915a09d2: Status 404 returned error can't find the container with id eed32fa900494c188721fa6399bf608a6ba14064764bda0e090bd521915a09d2 Apr 16 17:47:58.128449 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:47:58.128433 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 17:47:59.056587 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:47:59.056553 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6d867bf5b5-fp8rs" event={"ID":"1d6887bc-23dd-449c-aecc-8adedad8c860","Type":"ContainerStarted","Data":"2811f04246d679031c1dfbaf2fb512a3b4e886c6e461a301179f780463296e09"} Apr 16 17:47:59.056587 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:47:59.056588 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6d867bf5b5-fp8rs" event={"ID":"1d6887bc-23dd-449c-aecc-8adedad8c860","Type":"ContainerStarted","Data":"eed32fa900494c188721fa6399bf608a6ba14064764bda0e090bd521915a09d2"} Apr 16 17:47:59.080457 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:47:59.080409 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6d867bf5b5-fp8rs" podStartSLOduration=2.080396029 podStartE2EDuration="2.080396029s" podCreationTimestamp="2026-04-16 17:47:57 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 17:47:59.07841161 +0000 UTC m=+477.094879130" watchObservedRunningTime="2026-04-16 17:47:59.080396029 +0000 UTC m=+477.096863564" Apr 16 17:48:07.991422 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:48:07.991325 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6d867bf5b5-fp8rs" Apr 16 17:48:07.991422 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:48:07.991369 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6d867bf5b5-fp8rs" Apr 16 17:48:07.996244 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:48:07.996219 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6d867bf5b5-fp8rs" Apr 16 17:48:08.088365 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:48:08.088326 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6d867bf5b5-fp8rs" Apr 16 17:48:08.162198 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:48:08.162164 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7c75c85d46-ws7st"] Apr 16 17:48:33.184147 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:48:33.184083 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-7c75c85d46-ws7st" podUID="125afcbc-3390-4acd-81fb-07bc14feec8c" containerName="console" containerID="cri-o://4e0b0d0369ff4a047d1c9b163722316c3c42a33159984b06ee22c4b6b1e1f22f" gracePeriod=15 Apr 16 17:48:33.417600 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:48:33.417577 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7c75c85d46-ws7st_125afcbc-3390-4acd-81fb-07bc14feec8c/console/0.log" Apr 16 17:48:33.417711 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:48:33.417638 2576 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7c75c85d46-ws7st" Apr 16 17:48:33.478946 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:48:33.478817 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-55gsq\" (UniqueName: \"kubernetes.io/projected/125afcbc-3390-4acd-81fb-07bc14feec8c-kube-api-access-55gsq\") pod \"125afcbc-3390-4acd-81fb-07bc14feec8c\" (UID: \"125afcbc-3390-4acd-81fb-07bc14feec8c\") " Apr 16 17:48:33.478946 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:48:33.478879 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/125afcbc-3390-4acd-81fb-07bc14feec8c-service-ca\") pod \"125afcbc-3390-4acd-81fb-07bc14feec8c\" (UID: \"125afcbc-3390-4acd-81fb-07bc14feec8c\") " Apr 16 17:48:33.479180 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:48:33.478950 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/125afcbc-3390-4acd-81fb-07bc14feec8c-oauth-serving-cert\") pod \"125afcbc-3390-4acd-81fb-07bc14feec8c\" (UID: \"125afcbc-3390-4acd-81fb-07bc14feec8c\") " Apr 16 17:48:33.479180 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:48:33.478975 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/125afcbc-3390-4acd-81fb-07bc14feec8c-console-config\") pod \"125afcbc-3390-4acd-81fb-07bc14feec8c\" (UID: \"125afcbc-3390-4acd-81fb-07bc14feec8c\") " Apr 16 17:48:33.479180 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:48:33.479004 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/125afcbc-3390-4acd-81fb-07bc14feec8c-console-oauth-config\") pod \"125afcbc-3390-4acd-81fb-07bc14feec8c\" (UID: \"125afcbc-3390-4acd-81fb-07bc14feec8c\") " Apr 16 
17:48:33.479180 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:48:33.479033 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/125afcbc-3390-4acd-81fb-07bc14feec8c-console-serving-cert\") pod \"125afcbc-3390-4acd-81fb-07bc14feec8c\" (UID: \"125afcbc-3390-4acd-81fb-07bc14feec8c\") " Apr 16 17:48:33.479180 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:48:33.479070 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/125afcbc-3390-4acd-81fb-07bc14feec8c-trusted-ca-bundle\") pod \"125afcbc-3390-4acd-81fb-07bc14feec8c\" (UID: \"125afcbc-3390-4acd-81fb-07bc14feec8c\") " Apr 16 17:48:33.479526 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:48:33.479408 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/125afcbc-3390-4acd-81fb-07bc14feec8c-console-config" (OuterVolumeSpecName: "console-config") pod "125afcbc-3390-4acd-81fb-07bc14feec8c" (UID: "125afcbc-3390-4acd-81fb-07bc14feec8c"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 17:48:33.479605 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:48:33.479518 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/125afcbc-3390-4acd-81fb-07bc14feec8c-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "125afcbc-3390-4acd-81fb-07bc14feec8c" (UID: "125afcbc-3390-4acd-81fb-07bc14feec8c"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 17:48:33.479605 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:48:33.479508 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/125afcbc-3390-4acd-81fb-07bc14feec8c-service-ca" (OuterVolumeSpecName: "service-ca") pod "125afcbc-3390-4acd-81fb-07bc14feec8c" (UID: "125afcbc-3390-4acd-81fb-07bc14feec8c"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 17:48:33.479698 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:48:33.479599 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/125afcbc-3390-4acd-81fb-07bc14feec8c-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "125afcbc-3390-4acd-81fb-07bc14feec8c" (UID: "125afcbc-3390-4acd-81fb-07bc14feec8c"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 17:48:33.481087 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:48:33.481066 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/125afcbc-3390-4acd-81fb-07bc14feec8c-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "125afcbc-3390-4acd-81fb-07bc14feec8c" (UID: "125afcbc-3390-4acd-81fb-07bc14feec8c"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 17:48:33.481461 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:48:33.481440 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/125afcbc-3390-4acd-81fb-07bc14feec8c-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "125afcbc-3390-4acd-81fb-07bc14feec8c" (UID: "125afcbc-3390-4acd-81fb-07bc14feec8c"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 17:48:33.481527 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:48:33.481499 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/125afcbc-3390-4acd-81fb-07bc14feec8c-kube-api-access-55gsq" (OuterVolumeSpecName: "kube-api-access-55gsq") pod "125afcbc-3390-4acd-81fb-07bc14feec8c" (UID: "125afcbc-3390-4acd-81fb-07bc14feec8c"). InnerVolumeSpecName "kube-api-access-55gsq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 17:48:33.579635 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:48:33.579580 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/125afcbc-3390-4acd-81fb-07bc14feec8c-trusted-ca-bundle\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\"" Apr 16 17:48:33.579635 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:48:33.579624 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-55gsq\" (UniqueName: \"kubernetes.io/projected/125afcbc-3390-4acd-81fb-07bc14feec8c-kube-api-access-55gsq\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\"" Apr 16 17:48:33.579635 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:48:33.579635 2576 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/125afcbc-3390-4acd-81fb-07bc14feec8c-service-ca\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\"" Apr 16 17:48:33.579635 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:48:33.579646 2576 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/125afcbc-3390-4acd-81fb-07bc14feec8c-oauth-serving-cert\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\"" Apr 16 17:48:33.579635 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:48:33.579655 2576 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/125afcbc-3390-4acd-81fb-07bc14feec8c-console-config\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\"" Apr 16 17:48:33.579971 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:48:33.579664 2576 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/125afcbc-3390-4acd-81fb-07bc14feec8c-console-oauth-config\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\"" Apr 16 17:48:33.579971 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:48:33.579672 2576 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/125afcbc-3390-4acd-81fb-07bc14feec8c-console-serving-cert\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\"" Apr 16 17:48:34.151151 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:48:34.151122 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7c75c85d46-ws7st_125afcbc-3390-4acd-81fb-07bc14feec8c/console/0.log" Apr 16 17:48:34.151293 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:48:34.151161 2576 generic.go:358] "Generic (PLEG): container finished" podID="125afcbc-3390-4acd-81fb-07bc14feec8c" containerID="4e0b0d0369ff4a047d1c9b163722316c3c42a33159984b06ee22c4b6b1e1f22f" exitCode=2 Apr 16 17:48:34.151293 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:48:34.151194 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7c75c85d46-ws7st" event={"ID":"125afcbc-3390-4acd-81fb-07bc14feec8c","Type":"ContainerDied","Data":"4e0b0d0369ff4a047d1c9b163722316c3c42a33159984b06ee22c4b6b1e1f22f"} Apr 16 17:48:34.151293 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:48:34.151233 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7c75c85d46-ws7st" Apr 16 17:48:34.151293 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:48:34.151241 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7c75c85d46-ws7st" event={"ID":"125afcbc-3390-4acd-81fb-07bc14feec8c","Type":"ContainerDied","Data":"f2bfc0231ea16ad39a88f674d95de14358e6085d29636c0aaa11019e02572f4d"} Apr 16 17:48:34.151293 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:48:34.151260 2576 scope.go:117] "RemoveContainer" containerID="4e0b0d0369ff4a047d1c9b163722316c3c42a33159984b06ee22c4b6b1e1f22f" Apr 16 17:48:34.159343 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:48:34.159236 2576 scope.go:117] "RemoveContainer" containerID="4e0b0d0369ff4a047d1c9b163722316c3c42a33159984b06ee22c4b6b1e1f22f" Apr 16 17:48:34.159634 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:48:34.159602 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e0b0d0369ff4a047d1c9b163722316c3c42a33159984b06ee22c4b6b1e1f22f\": container with ID starting with 4e0b0d0369ff4a047d1c9b163722316c3c42a33159984b06ee22c4b6b1e1f22f not found: ID does not exist" containerID="4e0b0d0369ff4a047d1c9b163722316c3c42a33159984b06ee22c4b6b1e1f22f" Apr 16 17:48:34.159709 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:48:34.159638 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e0b0d0369ff4a047d1c9b163722316c3c42a33159984b06ee22c4b6b1e1f22f"} err="failed to get container status \"4e0b0d0369ff4a047d1c9b163722316c3c42a33159984b06ee22c4b6b1e1f22f\": rpc error: code = NotFound desc = could not find container \"4e0b0d0369ff4a047d1c9b163722316c3c42a33159984b06ee22c4b6b1e1f22f\": container with ID starting with 4e0b0d0369ff4a047d1c9b163722316c3c42a33159984b06ee22c4b6b1e1f22f not found: ID does not exist" Apr 16 17:48:34.175445 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:48:34.175413 2576 kubelet.go:2553] 
"SyncLoop DELETE" source="api" pods=["openshift-console/console-7c75c85d46-ws7st"] Apr 16 17:48:34.183247 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:48:34.183216 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-7c75c85d46-ws7st"] Apr 16 17:48:34.630747 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:48:34.626881 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="125afcbc-3390-4acd-81fb-07bc14feec8c" path="/var/lib/kubelet/pods/125afcbc-3390-4acd-81fb-07bc14feec8c/volumes" Apr 16 17:48:58.442389 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:48:58.442308 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvmskr"] Apr 16 17:48:58.442777 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:48:58.442594 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="125afcbc-3390-4acd-81fb-07bc14feec8c" containerName="console" Apr 16 17:48:58.442777 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:48:58.442605 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="125afcbc-3390-4acd-81fb-07bc14feec8c" containerName="console" Apr 16 17:48:58.442777 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:48:58.442647 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="125afcbc-3390-4acd-81fb-07bc14feec8c" containerName="console" Apr 16 17:48:58.445415 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:48:58.445391 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvmskr" Apr 16 17:48:58.448028 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:48:58.447996 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-l58bq\"" Apr 16 17:48:58.448160 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:48:58.448036 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 17:48:58.448160 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:48:58.448052 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 17:48:58.454825 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:48:58.454805 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvmskr"] Apr 16 17:48:58.574666 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:48:58.574622 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2cd54844-dda4-4b6d-bc81-44634b03bca1-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvmskr\" (UID: \"2cd54844-dda4-4b6d-bc81-44634b03bca1\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvmskr" Apr 16 17:48:58.574666 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:48:58.574667 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2cd54844-dda4-4b6d-bc81-44634b03bca1-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvmskr\" (UID: \"2cd54844-dda4-4b6d-bc81-44634b03bca1\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvmskr" Apr 16 17:48:58.574885 
ip-10-0-141-32 kubenswrapper[2576]: I0416 17:48:58.574793 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlfs6\" (UniqueName: \"kubernetes.io/projected/2cd54844-dda4-4b6d-bc81-44634b03bca1-kube-api-access-tlfs6\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvmskr\" (UID: \"2cd54844-dda4-4b6d-bc81-44634b03bca1\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvmskr" Apr 16 17:48:58.675297 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:48:58.675253 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tlfs6\" (UniqueName: \"kubernetes.io/projected/2cd54844-dda4-4b6d-bc81-44634b03bca1-kube-api-access-tlfs6\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvmskr\" (UID: \"2cd54844-dda4-4b6d-bc81-44634b03bca1\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvmskr" Apr 16 17:48:58.675297 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:48:58.675299 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2cd54844-dda4-4b6d-bc81-44634b03bca1-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvmskr\" (UID: \"2cd54844-dda4-4b6d-bc81-44634b03bca1\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvmskr" Apr 16 17:48:58.675557 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:48:58.675328 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2cd54844-dda4-4b6d-bc81-44634b03bca1-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvmskr\" (UID: \"2cd54844-dda4-4b6d-bc81-44634b03bca1\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvmskr" Apr 16 17:48:58.675747 ip-10-0-141-32 
kubenswrapper[2576]: I0416 17:48:58.675726 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2cd54844-dda4-4b6d-bc81-44634b03bca1-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvmskr\" (UID: \"2cd54844-dda4-4b6d-bc81-44634b03bca1\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvmskr" Apr 16 17:48:58.675747 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:48:58.675740 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2cd54844-dda4-4b6d-bc81-44634b03bca1-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvmskr\" (UID: \"2cd54844-dda4-4b6d-bc81-44634b03bca1\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvmskr" Apr 16 17:48:58.685865 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:48:58.685824 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tlfs6\" (UniqueName: \"kubernetes.io/projected/2cd54844-dda4-4b6d-bc81-44634b03bca1-kube-api-access-tlfs6\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvmskr\" (UID: \"2cd54844-dda4-4b6d-bc81-44634b03bca1\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvmskr" Apr 16 17:48:58.754405 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:48:58.754323 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvmskr" Apr 16 17:48:58.888546 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:48:58.888517 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvmskr"] Apr 16 17:48:58.890553 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:48:58.890515 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2cd54844_dda4_4b6d_bc81_44634b03bca1.slice/crio-2bdeaa717360f9277b23fb01a031ac1690c6f538390796accfbe7d08b8ee6793 WatchSource:0}: Error finding container 2bdeaa717360f9277b23fb01a031ac1690c6f538390796accfbe7d08b8ee6793: Status 404 returned error can't find the container with id 2bdeaa717360f9277b23fb01a031ac1690c6f538390796accfbe7d08b8ee6793 Apr 16 17:48:59.224079 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:48:59.224034 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvmskr" event={"ID":"2cd54844-dda4-4b6d-bc81-44634b03bca1","Type":"ContainerStarted","Data":"2bdeaa717360f9277b23fb01a031ac1690c6f538390796accfbe7d08b8ee6793"} Apr 16 17:49:04.239923 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:04.239868 2576 generic.go:358] "Generic (PLEG): container finished" podID="2cd54844-dda4-4b6d-bc81-44634b03bca1" containerID="73aad9fa2dbdaf8ccbc022e726cdb85407c7e84f7ed838d4c0a9b27b4790dcf6" exitCode=0 Apr 16 17:49:04.240401 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:04.239919 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvmskr" event={"ID":"2cd54844-dda4-4b6d-bc81-44634b03bca1","Type":"ContainerDied","Data":"73aad9fa2dbdaf8ccbc022e726cdb85407c7e84f7ed838d4c0a9b27b4790dcf6"} Apr 16 17:49:06.248995 ip-10-0-141-32 kubenswrapper[2576]: I0416 
17:49:06.248957 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvmskr" event={"ID":"2cd54844-dda4-4b6d-bc81-44634b03bca1","Type":"ContainerStarted","Data":"0c00893137c6239720aad78fba7d7955994b21b3e6340586d3e9c74b93048cc8"} Apr 16 17:49:07.253356 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:07.253319 2576 generic.go:358] "Generic (PLEG): container finished" podID="2cd54844-dda4-4b6d-bc81-44634b03bca1" containerID="0c00893137c6239720aad78fba7d7955994b21b3e6340586d3e9c74b93048cc8" exitCode=0 Apr 16 17:49:07.253739 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:07.253422 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvmskr" event={"ID":"2cd54844-dda4-4b6d-bc81-44634b03bca1","Type":"ContainerDied","Data":"0c00893137c6239720aad78fba7d7955994b21b3e6340586d3e9c74b93048cc8"} Apr 16 17:49:13.273527 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:13.273495 2576 generic.go:358] "Generic (PLEG): container finished" podID="2cd54844-dda4-4b6d-bc81-44634b03bca1" containerID="6ec65e97a659bf95e6a31911533c2091ab4e85d7902f0beb9a47b828b1c2c13e" exitCode=0 Apr 16 17:49:13.273896 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:13.273565 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvmskr" event={"ID":"2cd54844-dda4-4b6d-bc81-44634b03bca1","Type":"ContainerDied","Data":"6ec65e97a659bf95e6a31911533c2091ab4e85d7902f0beb9a47b828b1c2c13e"} Apr 16 17:49:14.394421 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:14.394398 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvmskr" Apr 16 17:49:14.508590 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:14.508547 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tlfs6\" (UniqueName: \"kubernetes.io/projected/2cd54844-dda4-4b6d-bc81-44634b03bca1-kube-api-access-tlfs6\") pod \"2cd54844-dda4-4b6d-bc81-44634b03bca1\" (UID: \"2cd54844-dda4-4b6d-bc81-44634b03bca1\") " Apr 16 17:49:14.508772 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:14.508614 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2cd54844-dda4-4b6d-bc81-44634b03bca1-util\") pod \"2cd54844-dda4-4b6d-bc81-44634b03bca1\" (UID: \"2cd54844-dda4-4b6d-bc81-44634b03bca1\") " Apr 16 17:49:14.508772 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:14.508644 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2cd54844-dda4-4b6d-bc81-44634b03bca1-bundle\") pod \"2cd54844-dda4-4b6d-bc81-44634b03bca1\" (UID: \"2cd54844-dda4-4b6d-bc81-44634b03bca1\") " Apr 16 17:49:14.509307 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:14.509279 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2cd54844-dda4-4b6d-bc81-44634b03bca1-bundle" (OuterVolumeSpecName: "bundle") pod "2cd54844-dda4-4b6d-bc81-44634b03bca1" (UID: "2cd54844-dda4-4b6d-bc81-44634b03bca1"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:49:14.510786 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:14.510760 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cd54844-dda4-4b6d-bc81-44634b03bca1-kube-api-access-tlfs6" (OuterVolumeSpecName: "kube-api-access-tlfs6") pod "2cd54844-dda4-4b6d-bc81-44634b03bca1" (UID: "2cd54844-dda4-4b6d-bc81-44634b03bca1"). InnerVolumeSpecName "kube-api-access-tlfs6". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 17:49:14.512823 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:14.512801 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2cd54844-dda4-4b6d-bc81-44634b03bca1-util" (OuterVolumeSpecName: "util") pod "2cd54844-dda4-4b6d-bc81-44634b03bca1" (UID: "2cd54844-dda4-4b6d-bc81-44634b03bca1"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:49:14.609235 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:14.609155 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tlfs6\" (UniqueName: \"kubernetes.io/projected/2cd54844-dda4-4b6d-bc81-44634b03bca1-kube-api-access-tlfs6\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\"" Apr 16 17:49:14.609235 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:14.609183 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2cd54844-dda4-4b6d-bc81-44634b03bca1-util\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\"" Apr 16 17:49:14.609235 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:14.609192 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2cd54844-dda4-4b6d-bc81-44634b03bca1-bundle\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\"" Apr 16 17:49:15.280593 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:15.280568 2576 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvmskr" Apr 16 17:49:15.280761 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:15.280566 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvmskr" event={"ID":"2cd54844-dda4-4b6d-bc81-44634b03bca1","Type":"ContainerDied","Data":"2bdeaa717360f9277b23fb01a031ac1690c6f538390796accfbe7d08b8ee6793"} Apr 16 17:49:15.280761 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:15.280671 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2bdeaa717360f9277b23fb01a031ac1690c6f538390796accfbe7d08b8ee6793" Apr 16 17:49:20.360067 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:20.360032 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-xbxgs"] Apr 16 17:49:20.360450 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:20.360311 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2cd54844-dda4-4b6d-bc81-44634b03bca1" containerName="extract" Apr 16 17:49:20.360450 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:20.360328 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cd54844-dda4-4b6d-bc81-44634b03bca1" containerName="extract" Apr 16 17:49:20.360450 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:20.360346 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2cd54844-dda4-4b6d-bc81-44634b03bca1" containerName="pull" Apr 16 17:49:20.360450 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:20.360354 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cd54844-dda4-4b6d-bc81-44634b03bca1" containerName="pull" Apr 16 17:49:20.360450 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:20.360370 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="2cd54844-dda4-4b6d-bc81-44634b03bca1" containerName="util" Apr 16 17:49:20.360450 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:20.360378 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cd54844-dda4-4b6d-bc81-44634b03bca1" containerName="util" Apr 16 17:49:20.360633 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:20.360453 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="2cd54844-dda4-4b6d-bc81-44634b03bca1" containerName="extract" Apr 16 17:49:20.399798 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:20.399761 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-xbxgs"] Apr 16 17:49:20.399999 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:20.399892 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-xbxgs" Apr 16 17:49:20.402786 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:20.402759 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 16 17:49:20.402946 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:20.402876 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 16 17:49:20.402946 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:20.402884 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 16 17:49:20.402946 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:20.402925 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-f29x4\"" Apr 16 17:49:20.559218 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:20.559174 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: 
\"kubernetes.io/secret/537d71d3-b6f7-4259-a666-6bbc67813630-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-xbxgs\" (UID: \"537d71d3-b6f7-4259-a666-6bbc67813630\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-xbxgs" Apr 16 17:49:20.559402 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:20.559225 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsr4k\" (UniqueName: \"kubernetes.io/projected/537d71d3-b6f7-4259-a666-6bbc67813630-kube-api-access-lsr4k\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-xbxgs\" (UID: \"537d71d3-b6f7-4259-a666-6bbc67813630\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-xbxgs" Apr 16 17:49:20.659709 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:20.659674 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/537d71d3-b6f7-4259-a666-6bbc67813630-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-xbxgs\" (UID: \"537d71d3-b6f7-4259-a666-6bbc67813630\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-xbxgs" Apr 16 17:49:20.659935 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:20.659724 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lsr4k\" (UniqueName: \"kubernetes.io/projected/537d71d3-b6f7-4259-a666-6bbc67813630-kube-api-access-lsr4k\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-xbxgs\" (UID: \"537d71d3-b6f7-4259-a666-6bbc67813630\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-xbxgs" Apr 16 17:49:20.662110 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:20.662083 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/537d71d3-b6f7-4259-a666-6bbc67813630-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-xbxgs\" (UID: 
\"537d71d3-b6f7-4259-a666-6bbc67813630\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-xbxgs" Apr 16 17:49:20.670828 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:20.670797 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsr4k\" (UniqueName: \"kubernetes.io/projected/537d71d3-b6f7-4259-a666-6bbc67813630-kube-api-access-lsr4k\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-xbxgs\" (UID: \"537d71d3-b6f7-4259-a666-6bbc67813630\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-xbxgs" Apr 16 17:49:20.709869 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:20.709825 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-xbxgs" Apr 16 17:49:20.842101 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:20.842062 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-xbxgs"] Apr 16 17:49:20.845443 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:49:20.845415 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod537d71d3_b6f7_4259_a666_6bbc67813630.slice/crio-89e0187c4ccc67fcc6a602f0836c75f0cca4b4ab61d8338b13905713a9c4ff1c WatchSource:0}: Error finding container 89e0187c4ccc67fcc6a602f0836c75f0cca4b4ab61d8338b13905713a9c4ff1c: Status 404 returned error can't find the container with id 89e0187c4ccc67fcc6a602f0836c75f0cca4b4ab61d8338b13905713a9c4ff1c Apr 16 17:49:21.300615 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:21.300582 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-xbxgs" event={"ID":"537d71d3-b6f7-4259-a666-6bbc67813630","Type":"ContainerStarted","Data":"89e0187c4ccc67fcc6a602f0836c75f0cca4b4ab61d8338b13905713a9c4ff1c"} Apr 16 17:49:26.222347 ip-10-0-141-32 kubenswrapper[2576]: I0416 
17:49:26.222309 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-56mt2"] Apr 16 17:49:26.245267 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:26.245231 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-56mt2"] Apr 16 17:49:26.245426 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:26.245391 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-56mt2" Apr 16 17:49:26.248255 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:26.248231 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\"" Apr 16 17:49:26.248548 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:26.248531 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-rggvx\"" Apr 16 17:49:26.248890 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:26.248873 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 16 17:49:26.305394 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:26.305352 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/90115d25-0a9d-4281-b290-85136d9334cd-certificates\") pod \"keda-operator-ffbb595cb-56mt2\" (UID: \"90115d25-0a9d-4281-b290-85136d9334cd\") " pod="openshift-keda/keda-operator-ffbb595cb-56mt2" Apr 16 17:49:26.305571 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:26.305436 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rr6gx\" (UniqueName: \"kubernetes.io/projected/90115d25-0a9d-4281-b290-85136d9334cd-kube-api-access-rr6gx\") pod \"keda-operator-ffbb595cb-56mt2\" (UID: \"90115d25-0a9d-4281-b290-85136d9334cd\") " 
pod="openshift-keda/keda-operator-ffbb595cb-56mt2" Apr 16 17:49:26.305571 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:26.305492 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/90115d25-0a9d-4281-b290-85136d9334cd-cabundle0\") pod \"keda-operator-ffbb595cb-56mt2\" (UID: \"90115d25-0a9d-4281-b290-85136d9334cd\") " pod="openshift-keda/keda-operator-ffbb595cb-56mt2" Apr 16 17:49:26.319009 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:26.318970 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-xbxgs" event={"ID":"537d71d3-b6f7-4259-a666-6bbc67813630","Type":"ContainerStarted","Data":"2b59697ef0dd9e89d28526ba71fdbb092c29a687b203e221bce1201500ab60c3"} Apr 16 17:49:26.319177 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:26.319045 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-xbxgs" Apr 16 17:49:26.342599 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:26.342420 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-xbxgs" podStartSLOduration=1.619275337 podStartE2EDuration="6.342400946s" podCreationTimestamp="2026-04-16 17:49:20 +0000 UTC" firstStartedPulling="2026-04-16 17:49:20.847101711 +0000 UTC m=+558.863569211" lastFinishedPulling="2026-04-16 17:49:25.570227317 +0000 UTC m=+563.586694820" observedRunningTime="2026-04-16 17:49:26.340893997 +0000 UTC m=+564.357361519" watchObservedRunningTime="2026-04-16 17:49:26.342400946 +0000 UTC m=+564.358868467" Apr 16 17:49:26.406635 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:26.406583 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: 
\"kubernetes.io/projected/90115d25-0a9d-4281-b290-85136d9334cd-certificates\") pod \"keda-operator-ffbb595cb-56mt2\" (UID: \"90115d25-0a9d-4281-b290-85136d9334cd\") " pod="openshift-keda/keda-operator-ffbb595cb-56mt2" Apr 16 17:49:26.406635 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:26.406642 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rr6gx\" (UniqueName: \"kubernetes.io/projected/90115d25-0a9d-4281-b290-85136d9334cd-kube-api-access-rr6gx\") pod \"keda-operator-ffbb595cb-56mt2\" (UID: \"90115d25-0a9d-4281-b290-85136d9334cd\") " pod="openshift-keda/keda-operator-ffbb595cb-56mt2" Apr 16 17:49:26.406898 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:26.406679 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/90115d25-0a9d-4281-b290-85136d9334cd-cabundle0\") pod \"keda-operator-ffbb595cb-56mt2\" (UID: \"90115d25-0a9d-4281-b290-85136d9334cd\") " pod="openshift-keda/keda-operator-ffbb595cb-56mt2" Apr 16 17:49:26.406898 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:49:26.406802 2576 secret.go:281] references non-existent secret key: ca.crt Apr 16 17:49:26.406898 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:49:26.406820 2576 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 16 17:49:26.406898 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:49:26.406830 2576 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-56mt2: references non-existent secret key: ca.crt Apr 16 17:49:26.407128 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:49:26.406896 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/90115d25-0a9d-4281-b290-85136d9334cd-certificates podName:90115d25-0a9d-4281-b290-85136d9334cd nodeName:}" failed. 
No retries permitted until 2026-04-16 17:49:26.906875136 +0000 UTC m=+564.923342649 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/90115d25-0a9d-4281-b290-85136d9334cd-certificates") pod "keda-operator-ffbb595cb-56mt2" (UID: "90115d25-0a9d-4281-b290-85136d9334cd") : references non-existent secret key: ca.crt Apr 16 17:49:26.407380 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:26.407360 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/90115d25-0a9d-4281-b290-85136d9334cd-cabundle0\") pod \"keda-operator-ffbb595cb-56mt2\" (UID: \"90115d25-0a9d-4281-b290-85136d9334cd\") " pod="openshift-keda/keda-operator-ffbb595cb-56mt2" Apr 16 17:49:26.421490 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:26.421458 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rr6gx\" (UniqueName: \"kubernetes.io/projected/90115d25-0a9d-4281-b290-85136d9334cd-kube-api-access-rr6gx\") pod \"keda-operator-ffbb595cb-56mt2\" (UID: \"90115d25-0a9d-4281-b290-85136d9334cd\") " pod="openshift-keda/keda-operator-ffbb595cb-56mt2" Apr 16 17:49:26.560611 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:26.560523 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-6prp9"] Apr 16 17:49:26.584714 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:26.584676 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-6prp9"] Apr 16 17:49:26.584897 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:26.584835 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6prp9" Apr 16 17:49:26.587761 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:26.587734 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\"" Apr 16 17:49:26.608055 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:26.608025 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxt66\" (UniqueName: \"kubernetes.io/projected/5ec126c2-df44-4db9-a1c7-9fd3b94574ae-kube-api-access-bxt66\") pod \"keda-metrics-apiserver-7c9f485588-6prp9\" (UID: \"5ec126c2-df44-4db9-a1c7-9fd3b94574ae\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6prp9" Apr 16 17:49:26.608221 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:26.608086 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/5ec126c2-df44-4db9-a1c7-9fd3b94574ae-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-6prp9\" (UID: \"5ec126c2-df44-4db9-a1c7-9fd3b94574ae\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6prp9" Apr 16 17:49:26.612599 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:26.608492 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/5ec126c2-df44-4db9-a1c7-9fd3b94574ae-certificates\") pod \"keda-metrics-apiserver-7c9f485588-6prp9\" (UID: \"5ec126c2-df44-4db9-a1c7-9fd3b94574ae\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6prp9" Apr 16 17:49:26.708875 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:26.708837 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/5ec126c2-df44-4db9-a1c7-9fd3b94574ae-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-6prp9\" (UID: 
\"5ec126c2-df44-4db9-a1c7-9fd3b94574ae\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6prp9" Apr 16 17:49:26.708875 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:26.708881 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/5ec126c2-df44-4db9-a1c7-9fd3b94574ae-certificates\") pod \"keda-metrics-apiserver-7c9f485588-6prp9\" (UID: \"5ec126c2-df44-4db9-a1c7-9fd3b94574ae\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6prp9" Apr 16 17:49:26.709094 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:26.709026 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bxt66\" (UniqueName: \"kubernetes.io/projected/5ec126c2-df44-4db9-a1c7-9fd3b94574ae-kube-api-access-bxt66\") pod \"keda-metrics-apiserver-7c9f485588-6prp9\" (UID: \"5ec126c2-df44-4db9-a1c7-9fd3b94574ae\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6prp9" Apr 16 17:49:26.709176 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:49:26.709158 2576 secret.go:281] references non-existent secret key: tls.crt Apr 16 17:49:26.709210 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:49:26.709181 2576 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 16 17:49:26.709210 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:49:26.709205 2576 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-6prp9: references non-existent secret key: tls.crt Apr 16 17:49:26.709272 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:26.709218 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/5ec126c2-df44-4db9-a1c7-9fd3b94574ae-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-6prp9\" (UID: \"5ec126c2-df44-4db9-a1c7-9fd3b94574ae\") " 
pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6prp9" Apr 16 17:49:26.709272 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:49:26.709265 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5ec126c2-df44-4db9-a1c7-9fd3b94574ae-certificates podName:5ec126c2-df44-4db9-a1c7-9fd3b94574ae nodeName:}" failed. No retries permitted until 2026-04-16 17:49:27.209246565 +0000 UTC m=+565.225714073 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/5ec126c2-df44-4db9-a1c7-9fd3b94574ae-certificates") pod "keda-metrics-apiserver-7c9f485588-6prp9" (UID: "5ec126c2-df44-4db9-a1c7-9fd3b94574ae") : references non-existent secret key: tls.crt Apr 16 17:49:26.723161 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:26.723124 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxt66\" (UniqueName: \"kubernetes.io/projected/5ec126c2-df44-4db9-a1c7-9fd3b94574ae-kube-api-access-bxt66\") pod \"keda-metrics-apiserver-7c9f485588-6prp9\" (UID: \"5ec126c2-df44-4db9-a1c7-9fd3b94574ae\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6prp9" Apr 16 17:49:26.846860 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:26.846773 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-46qqw"] Apr 16 17:49:26.871336 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:26.871299 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-46qqw"] Apr 16 17:49:26.871486 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:26.871445 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-46qqw" Apr 16 17:49:26.874173 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:26.874142 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\"" Apr 16 17:49:26.910256 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:26.910214 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qb2r\" (UniqueName: \"kubernetes.io/projected/01f8d9f4-5aa4-4077-8690-96a301c1f775-kube-api-access-9qb2r\") pod \"keda-admission-cf49989db-46qqw\" (UID: \"01f8d9f4-5aa4-4077-8690-96a301c1f775\") " pod="openshift-keda/keda-admission-cf49989db-46qqw" Apr 16 17:49:26.910435 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:26.910292 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/01f8d9f4-5aa4-4077-8690-96a301c1f775-certificates\") pod \"keda-admission-cf49989db-46qqw\" (UID: \"01f8d9f4-5aa4-4077-8690-96a301c1f775\") " pod="openshift-keda/keda-admission-cf49989db-46qqw" Apr 16 17:49:26.910435 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:26.910388 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/90115d25-0a9d-4281-b290-85136d9334cd-certificates\") pod \"keda-operator-ffbb595cb-56mt2\" (UID: \"90115d25-0a9d-4281-b290-85136d9334cd\") " pod="openshift-keda/keda-operator-ffbb595cb-56mt2" Apr 16 17:49:26.910509 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:49:26.910460 2576 secret.go:281] references non-existent secret key: ca.crt Apr 16 17:49:26.910509 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:49:26.910476 2576 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 16 17:49:26.910509 ip-10-0-141-32 kubenswrapper[2576]: E0416 
17:49:26.910485 2576 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-56mt2: references non-existent secret key: ca.crt Apr 16 17:49:26.910608 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:49:26.910531 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/90115d25-0a9d-4281-b290-85136d9334cd-certificates podName:90115d25-0a9d-4281-b290-85136d9334cd nodeName:}" failed. No retries permitted until 2026-04-16 17:49:27.910517764 +0000 UTC m=+565.926985264 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/90115d25-0a9d-4281-b290-85136d9334cd-certificates") pod "keda-operator-ffbb595cb-56mt2" (UID: "90115d25-0a9d-4281-b290-85136d9334cd") : references non-existent secret key: ca.crt Apr 16 17:49:27.011054 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:27.011011 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9qb2r\" (UniqueName: \"kubernetes.io/projected/01f8d9f4-5aa4-4077-8690-96a301c1f775-kube-api-access-9qb2r\") pod \"keda-admission-cf49989db-46qqw\" (UID: \"01f8d9f4-5aa4-4077-8690-96a301c1f775\") " pod="openshift-keda/keda-admission-cf49989db-46qqw" Apr 16 17:49:27.011054 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:27.011053 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/01f8d9f4-5aa4-4077-8690-96a301c1f775-certificates\") pod \"keda-admission-cf49989db-46qqw\" (UID: \"01f8d9f4-5aa4-4077-8690-96a301c1f775\") " pod="openshift-keda/keda-admission-cf49989db-46qqw" Apr 16 17:49:27.011284 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:49:27.011193 2576 projected.go:264] Couldn't get secret openshift-keda/keda-admission-webhooks-certs: secret "keda-admission-webhooks-certs" not found Apr 16 17:49:27.011284 ip-10-0-141-32 kubenswrapper[2576]: E0416 
17:49:27.011211 2576 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-admission-cf49989db-46qqw: secret "keda-admission-webhooks-certs" not found Apr 16 17:49:27.011284 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:49:27.011258 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/01f8d9f4-5aa4-4077-8690-96a301c1f775-certificates podName:01f8d9f4-5aa4-4077-8690-96a301c1f775 nodeName:}" failed. No retries permitted until 2026-04-16 17:49:27.511244881 +0000 UTC m=+565.527712380 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/01f8d9f4-5aa4-4077-8690-96a301c1f775-certificates") pod "keda-admission-cf49989db-46qqw" (UID: "01f8d9f4-5aa4-4077-8690-96a301c1f775") : secret "keda-admission-webhooks-certs" not found Apr 16 17:49:27.023658 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:27.023631 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qb2r\" (UniqueName: \"kubernetes.io/projected/01f8d9f4-5aa4-4077-8690-96a301c1f775-kube-api-access-9qb2r\") pod \"keda-admission-cf49989db-46qqw\" (UID: \"01f8d9f4-5aa4-4077-8690-96a301c1f775\") " pod="openshift-keda/keda-admission-cf49989db-46qqw" Apr 16 17:49:27.212535 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:27.212493 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/5ec126c2-df44-4db9-a1c7-9fd3b94574ae-certificates\") pod \"keda-metrics-apiserver-7c9f485588-6prp9\" (UID: \"5ec126c2-df44-4db9-a1c7-9fd3b94574ae\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6prp9" Apr 16 17:49:27.212692 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:49:27.212627 2576 secret.go:281] references non-existent secret key: tls.crt Apr 16 17:49:27.212692 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:49:27.212644 2576 projected.go:277] Couldn't get secret 
payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 16 17:49:27.212692 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:49:27.212662 2576 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-6prp9: references non-existent secret key: tls.crt Apr 16 17:49:27.212804 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:49:27.212718 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5ec126c2-df44-4db9-a1c7-9fd3b94574ae-certificates podName:5ec126c2-df44-4db9-a1c7-9fd3b94574ae nodeName:}" failed. No retries permitted until 2026-04-16 17:49:28.212701761 +0000 UTC m=+566.229169280 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/5ec126c2-df44-4db9-a1c7-9fd3b94574ae-certificates") pod "keda-metrics-apiserver-7c9f485588-6prp9" (UID: "5ec126c2-df44-4db9-a1c7-9fd3b94574ae") : references non-existent secret key: tls.crt Apr 16 17:49:27.514209 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:27.514109 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/01f8d9f4-5aa4-4077-8690-96a301c1f775-certificates\") pod \"keda-admission-cf49989db-46qqw\" (UID: \"01f8d9f4-5aa4-4077-8690-96a301c1f775\") " pod="openshift-keda/keda-admission-cf49989db-46qqw" Apr 16 17:49:27.516540 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:27.516517 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/01f8d9f4-5aa4-4077-8690-96a301c1f775-certificates\") pod \"keda-admission-cf49989db-46qqw\" (UID: \"01f8d9f4-5aa4-4077-8690-96a301c1f775\") " pod="openshift-keda/keda-admission-cf49989db-46qqw" Apr 16 17:49:27.782150 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:27.782060 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-46qqw" Apr 16 17:49:27.908539 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:27.908518 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-46qqw"] Apr 16 17:49:27.911429 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:49:27.911403 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01f8d9f4_5aa4_4077_8690_96a301c1f775.slice/crio-5b8ac93ab5a3071db61581f74d1dfc47e9596b14ca4d66bbae3d402913801764 WatchSource:0}: Error finding container 5b8ac93ab5a3071db61581f74d1dfc47e9596b14ca4d66bbae3d402913801764: Status 404 returned error can't find the container with id 5b8ac93ab5a3071db61581f74d1dfc47e9596b14ca4d66bbae3d402913801764 Apr 16 17:49:27.918471 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:27.918443 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/90115d25-0a9d-4281-b290-85136d9334cd-certificates\") pod \"keda-operator-ffbb595cb-56mt2\" (UID: \"90115d25-0a9d-4281-b290-85136d9334cd\") " pod="openshift-keda/keda-operator-ffbb595cb-56mt2" Apr 16 17:49:27.918611 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:49:27.918594 2576 secret.go:281] references non-existent secret key: ca.crt Apr 16 17:49:27.918658 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:49:27.918615 2576 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 16 17:49:27.918658 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:49:27.918624 2576 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-56mt2: references non-existent secret key: ca.crt Apr 16 17:49:27.918720 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:49:27.918674 2576 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/90115d25-0a9d-4281-b290-85136d9334cd-certificates podName:90115d25-0a9d-4281-b290-85136d9334cd nodeName:}" failed. No retries permitted until 2026-04-16 17:49:29.918657867 +0000 UTC m=+567.935125387 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/90115d25-0a9d-4281-b290-85136d9334cd-certificates") pod "keda-operator-ffbb595cb-56mt2" (UID: "90115d25-0a9d-4281-b290-85136d9334cd") : references non-existent secret key: ca.crt Apr 16 17:49:28.220404 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:28.220360 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/5ec126c2-df44-4db9-a1c7-9fd3b94574ae-certificates\") pod \"keda-metrics-apiserver-7c9f485588-6prp9\" (UID: \"5ec126c2-df44-4db9-a1c7-9fd3b94574ae\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6prp9" Apr 16 17:49:28.220593 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:49:28.220501 2576 secret.go:281] references non-existent secret key: tls.crt Apr 16 17:49:28.220593 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:49:28.220523 2576 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 16 17:49:28.220593 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:49:28.220541 2576 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-6prp9: references non-existent secret key: tls.crt Apr 16 17:49:28.220699 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:49:28.220594 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5ec126c2-df44-4db9-a1c7-9fd3b94574ae-certificates podName:5ec126c2-df44-4db9-a1c7-9fd3b94574ae nodeName:}" failed. No retries permitted until 2026-04-16 17:49:30.220579728 +0000 UTC m=+568.237047228 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/5ec126c2-df44-4db9-a1c7-9fd3b94574ae-certificates") pod "keda-metrics-apiserver-7c9f485588-6prp9" (UID: "5ec126c2-df44-4db9-a1c7-9fd3b94574ae") : references non-existent secret key: tls.crt Apr 16 17:49:28.325387 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:28.325345 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-46qqw" event={"ID":"01f8d9f4-5aa4-4077-8690-96a301c1f775","Type":"ContainerStarted","Data":"5b8ac93ab5a3071db61581f74d1dfc47e9596b14ca4d66bbae3d402913801764"} Apr 16 17:49:29.330205 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:29.330169 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-46qqw" event={"ID":"01f8d9f4-5aa4-4077-8690-96a301c1f775","Type":"ContainerStarted","Data":"64a56ec667466d7c6a1730b287811f54dfaab556be138ce64dc147a47c34760a"} Apr 16 17:49:29.330611 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:29.330225 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-46qqw" Apr 16 17:49:29.350689 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:29.350625 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-46qqw" podStartSLOduration=2.089848082 podStartE2EDuration="3.350604859s" podCreationTimestamp="2026-04-16 17:49:26 +0000 UTC" firstStartedPulling="2026-04-16 17:49:27.912679976 +0000 UTC m=+565.929147475" lastFinishedPulling="2026-04-16 17:49:29.173436749 +0000 UTC m=+567.189904252" observedRunningTime="2026-04-16 17:49:29.349490324 +0000 UTC m=+567.365957845" watchObservedRunningTime="2026-04-16 17:49:29.350604859 +0000 UTC m=+567.367072381" Apr 16 17:49:29.934582 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:29.934545 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"certificates\" (UniqueName: \"kubernetes.io/projected/90115d25-0a9d-4281-b290-85136d9334cd-certificates\") pod \"keda-operator-ffbb595cb-56mt2\" (UID: \"90115d25-0a9d-4281-b290-85136d9334cd\") " pod="openshift-keda/keda-operator-ffbb595cb-56mt2" Apr 16 17:49:29.934772 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:49:29.934694 2576 secret.go:281] references non-existent secret key: ca.crt Apr 16 17:49:29.934772 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:49:29.934716 2576 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 16 17:49:29.934772 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:49:29.934725 2576 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-56mt2: references non-existent secret key: ca.crt Apr 16 17:49:29.934873 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:49:29.934777 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/90115d25-0a9d-4281-b290-85136d9334cd-certificates podName:90115d25-0a9d-4281-b290-85136d9334cd nodeName:}" failed. No retries permitted until 2026-04-16 17:49:33.9347636 +0000 UTC m=+571.951231099 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/90115d25-0a9d-4281-b290-85136d9334cd-certificates") pod "keda-operator-ffbb595cb-56mt2" (UID: "90115d25-0a9d-4281-b290-85136d9334cd") : references non-existent secret key: ca.crt Apr 16 17:49:30.235936 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:30.235833 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/5ec126c2-df44-4db9-a1c7-9fd3b94574ae-certificates\") pod \"keda-metrics-apiserver-7c9f485588-6prp9\" (UID: \"5ec126c2-df44-4db9-a1c7-9fd3b94574ae\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6prp9" Apr 16 17:49:30.236090 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:49:30.235991 2576 secret.go:281] references non-existent secret key: tls.crt Apr 16 17:49:30.236090 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:49:30.236013 2576 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 16 17:49:30.236090 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:49:30.236031 2576 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-6prp9: references non-existent secret key: tls.crt Apr 16 17:49:30.236090 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:49:30.236086 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5ec126c2-df44-4db9-a1c7-9fd3b94574ae-certificates podName:5ec126c2-df44-4db9-a1c7-9fd3b94574ae nodeName:}" failed. No retries permitted until 2026-04-16 17:49:34.236072167 +0000 UTC m=+572.252539666 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/5ec126c2-df44-4db9-a1c7-9fd3b94574ae-certificates") pod "keda-metrics-apiserver-7c9f485588-6prp9" (UID: "5ec126c2-df44-4db9-a1c7-9fd3b94574ae") : references non-existent secret key: tls.crt Apr 16 17:49:33.965469 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:33.965425 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/90115d25-0a9d-4281-b290-85136d9334cd-certificates\") pod \"keda-operator-ffbb595cb-56mt2\" (UID: \"90115d25-0a9d-4281-b290-85136d9334cd\") " pod="openshift-keda/keda-operator-ffbb595cb-56mt2" Apr 16 17:49:33.967861 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:33.967838 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/90115d25-0a9d-4281-b290-85136d9334cd-certificates\") pod \"keda-operator-ffbb595cb-56mt2\" (UID: \"90115d25-0a9d-4281-b290-85136d9334cd\") " pod="openshift-keda/keda-operator-ffbb595cb-56mt2" Apr 16 17:49:34.055504 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:34.055469 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-56mt2" Apr 16 17:49:34.180885 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:34.180860 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-56mt2"] Apr 16 17:49:34.182886 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:49:34.182863 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod90115d25_0a9d_4281_b290_85136d9334cd.slice/crio-8f1cda8a59e7e470dd38135971b0c4c46a6a381736c8da8cd529e445e05f3bce WatchSource:0}: Error finding container 8f1cda8a59e7e470dd38135971b0c4c46a6a381736c8da8cd529e445e05f3bce: Status 404 returned error can't find the container with id 8f1cda8a59e7e470dd38135971b0c4c46a6a381736c8da8cd529e445e05f3bce Apr 16 17:49:34.268293 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:34.268207 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/5ec126c2-df44-4db9-a1c7-9fd3b94574ae-certificates\") pod \"keda-metrics-apiserver-7c9f485588-6prp9\" (UID: \"5ec126c2-df44-4db9-a1c7-9fd3b94574ae\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6prp9" Apr 16 17:49:34.270707 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:34.270674 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/5ec126c2-df44-4db9-a1c7-9fd3b94574ae-certificates\") pod \"keda-metrics-apiserver-7c9f485588-6prp9\" (UID: \"5ec126c2-df44-4db9-a1c7-9fd3b94574ae\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6prp9" Apr 16 17:49:34.348072 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:34.348028 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-56mt2" 
event={"ID":"90115d25-0a9d-4281-b290-85136d9334cd","Type":"ContainerStarted","Data":"8f1cda8a59e7e470dd38135971b0c4c46a6a381736c8da8cd529e445e05f3bce"}
Apr 16 17:49:34.396148 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:34.396111 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6prp9"
Apr 16 17:49:34.524526 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:34.524450 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-6prp9"]
Apr 16 17:49:34.527330 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:49:34.527301 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ec126c2_df44_4db9_a1c7_9fd3b94574ae.slice/crio-66c95421fb1d1da9847265ce63f6ab9b27dcbeb1fda00f45af6fc604d9318225 WatchSource:0}: Error finding container 66c95421fb1d1da9847265ce63f6ab9b27dcbeb1fda00f45af6fc604d9318225: Status 404 returned error can't find the container with id 66c95421fb1d1da9847265ce63f6ab9b27dcbeb1fda00f45af6fc604d9318225
Apr 16 17:49:35.354450 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:35.354415 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6prp9" event={"ID":"5ec126c2-df44-4db9-a1c7-9fd3b94574ae","Type":"ContainerStarted","Data":"66c95421fb1d1da9847265ce63f6ab9b27dcbeb1fda00f45af6fc604d9318225"}
Apr 16 17:49:38.366704 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:38.366594 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-56mt2" event={"ID":"90115d25-0a9d-4281-b290-85136d9334cd","Type":"ContainerStarted","Data":"dc51c80b084d45218dd6fc9de1b6e6029fcb6349b26b2ef6012a918ec2cf4faa"}
Apr 16 17:49:38.367232 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:38.366756 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-56mt2"
Apr 16 17:49:38.368064 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:38.368039 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6prp9" event={"ID":"5ec126c2-df44-4db9-a1c7-9fd3b94574ae","Type":"ContainerStarted","Data":"d701aa9f543b627635b11c404282af30d1e1f4096db540ab48bf9147399e42ef"}
Apr 16 17:49:38.368153 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:38.368140 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6prp9"
Apr 16 17:49:38.389241 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:38.389173 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-56mt2" podStartSLOduration=8.538932638 podStartE2EDuration="12.389128614s" podCreationTimestamp="2026-04-16 17:49:26 +0000 UTC" firstStartedPulling="2026-04-16 17:49:34.184209412 +0000 UTC m=+572.200676912" lastFinishedPulling="2026-04-16 17:49:38.034405386 +0000 UTC m=+576.050872888" observedRunningTime="2026-04-16 17:49:38.388116071 +0000 UTC m=+576.404583594" watchObservedRunningTime="2026-04-16 17:49:38.389128614 +0000 UTC m=+576.405596137"
Apr 16 17:49:38.409873 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:38.409813 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6prp9" podStartSLOduration=8.910177089 podStartE2EDuration="12.409795331s" podCreationTimestamp="2026-04-16 17:49:26 +0000 UTC" firstStartedPulling="2026-04-16 17:49:34.528616485 +0000 UTC m=+572.545083984" lastFinishedPulling="2026-04-16 17:49:38.028234725 +0000 UTC m=+576.044702226" observedRunningTime="2026-04-16 17:49:38.408854118 +0000 UTC m=+576.425321640" watchObservedRunningTime="2026-04-16 17:49:38.409795331 +0000 UTC m=+576.426262866"
Apr 16 17:49:47.324385 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:47.324355 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-xbxgs"
Apr 16 17:49:49.375507 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:49.375480 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6prp9"
Apr 16 17:49:50.335305 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:50.335271 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-46qqw"
Apr 16 17:49:59.373978 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:49:59.373945 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-56mt2"
Apr 16 17:50:02.511823 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:50:02.511788 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-6667474d89-gvbhc_1b5a2170-b541-4c6d-918a-c836b3286e61/cluster-monitoring-operator/0.log"
Apr 16 17:50:02.512438 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:50:02.512417 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-6667474d89-gvbhc_1b5a2170-b541-4c6d-918a-c836b3286e61/cluster-monitoring-operator/0.log"
Apr 16 17:50:19.049480 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:50:19.049442 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hdzm8"]
Apr 16 17:50:19.060875 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:50:19.060846 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hdzm8"
Apr 16 17:50:19.062804 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:50:19.062772 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hdzm8"]
Apr 16 17:50:19.063668 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:50:19.063651 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 16 17:50:19.064852 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:50:19.064827 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-l58bq\""
Apr 16 17:50:19.064974 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:50:19.064885 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 16 17:50:19.228774 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:50:19.228733 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3001bebe-2cbe-4356-ae3e-f13cd66e870c-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hdzm8\" (UID: \"3001bebe-2cbe-4356-ae3e-f13cd66e870c\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hdzm8"
Apr 16 17:50:19.228970 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:50:19.228788 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqk79\" (UniqueName: \"kubernetes.io/projected/3001bebe-2cbe-4356-ae3e-f13cd66e870c-kube-api-access-gqk79\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hdzm8\" (UID: \"3001bebe-2cbe-4356-ae3e-f13cd66e870c\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hdzm8"
Apr 16 17:50:19.228970 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:50:19.228883 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3001bebe-2cbe-4356-ae3e-f13cd66e870c-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hdzm8\" (UID: \"3001bebe-2cbe-4356-ae3e-f13cd66e870c\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hdzm8"
Apr 16 17:50:19.330281 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:50:19.330185 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3001bebe-2cbe-4356-ae3e-f13cd66e870c-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hdzm8\" (UID: \"3001bebe-2cbe-4356-ae3e-f13cd66e870c\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hdzm8"
Apr 16 17:50:19.330281 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:50:19.330250 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gqk79\" (UniqueName: \"kubernetes.io/projected/3001bebe-2cbe-4356-ae3e-f13cd66e870c-kube-api-access-gqk79\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hdzm8\" (UID: \"3001bebe-2cbe-4356-ae3e-f13cd66e870c\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hdzm8"
Apr 16 17:50:19.330488 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:50:19.330294 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3001bebe-2cbe-4356-ae3e-f13cd66e870c-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hdzm8\" (UID: \"3001bebe-2cbe-4356-ae3e-f13cd66e870c\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hdzm8"
Apr 16 17:50:19.330581 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:50:19.330561 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3001bebe-2cbe-4356-ae3e-f13cd66e870c-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hdzm8\" (UID: \"3001bebe-2cbe-4356-ae3e-f13cd66e870c\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hdzm8"
Apr 16 17:50:19.330634 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:50:19.330588 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3001bebe-2cbe-4356-ae3e-f13cd66e870c-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hdzm8\" (UID: \"3001bebe-2cbe-4356-ae3e-f13cd66e870c\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hdzm8"
Apr 16 17:50:19.343294 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:50:19.343270 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqk79\" (UniqueName: \"kubernetes.io/projected/3001bebe-2cbe-4356-ae3e-f13cd66e870c-kube-api-access-gqk79\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hdzm8\" (UID: \"3001bebe-2cbe-4356-ae3e-f13cd66e870c\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hdzm8"
Apr 16 17:50:19.371849 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:50:19.371820 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hdzm8"
Apr 16 17:50:19.497114 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:50:19.497082 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hdzm8"]
Apr 16 17:50:19.499673 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:50:19.499644 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3001bebe_2cbe_4356_ae3e_f13cd66e870c.slice/crio-9e1fcc0d576eb8952ebaef48d9db66d98f00357b32c8ed66f744a873e28cbda1 WatchSource:0}: Error finding container 9e1fcc0d576eb8952ebaef48d9db66d98f00357b32c8ed66f744a873e28cbda1: Status 404 returned error can't find the container with id 9e1fcc0d576eb8952ebaef48d9db66d98f00357b32c8ed66f744a873e28cbda1
Apr 16 17:50:20.504893 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:50:20.504857 2576 generic.go:358] "Generic (PLEG): container finished" podID="3001bebe-2cbe-4356-ae3e-f13cd66e870c" containerID="59d904d8e85af769812b15d70c193929285764411efd02a253bdb96e4580cf50" exitCode=0
Apr 16 17:50:20.505282 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:50:20.504943 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hdzm8" event={"ID":"3001bebe-2cbe-4356-ae3e-f13cd66e870c","Type":"ContainerDied","Data":"59d904d8e85af769812b15d70c193929285764411efd02a253bdb96e4580cf50"}
Apr 16 17:50:20.505282 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:50:20.504978 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hdzm8" event={"ID":"3001bebe-2cbe-4356-ae3e-f13cd66e870c","Type":"ContainerStarted","Data":"9e1fcc0d576eb8952ebaef48d9db66d98f00357b32c8ed66f744a873e28cbda1"}
Apr 16 17:50:21.510356 ip-10-0-141-32 kubenswrapper[2576]: I0416
17:50:21.510260 2576 generic.go:358] "Generic (PLEG): container finished" podID="3001bebe-2cbe-4356-ae3e-f13cd66e870c" containerID="f2a269500c02c4c24678ac27772eb9a9887cbf1b7152ba15a24aef860835db71" exitCode=0
Apr 16 17:50:21.510356 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:50:21.510334 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hdzm8" event={"ID":"3001bebe-2cbe-4356-ae3e-f13cd66e870c","Type":"ContainerDied","Data":"f2a269500c02c4c24678ac27772eb9a9887cbf1b7152ba15a24aef860835db71"}
Apr 16 17:50:22.515655 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:50:22.515619 2576 generic.go:358] "Generic (PLEG): container finished" podID="3001bebe-2cbe-4356-ae3e-f13cd66e870c" containerID="b77be5518ef5b8c3d9f504dadc7c12191379826419930a6f7e4fd8285318cdad" exitCode=0
Apr 16 17:50:22.516058 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:50:22.515680 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hdzm8" event={"ID":"3001bebe-2cbe-4356-ae3e-f13cd66e870c","Type":"ContainerDied","Data":"b77be5518ef5b8c3d9f504dadc7c12191379826419930a6f7e4fd8285318cdad"}
Apr 16 17:50:23.638046 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:50:23.638015 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hdzm8"
Apr 16 17:50:23.768640 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:50:23.768605 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gqk79\" (UniqueName: \"kubernetes.io/projected/3001bebe-2cbe-4356-ae3e-f13cd66e870c-kube-api-access-gqk79\") pod \"3001bebe-2cbe-4356-ae3e-f13cd66e870c\" (UID: \"3001bebe-2cbe-4356-ae3e-f13cd66e870c\") "
Apr 16 17:50:23.768640 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:50:23.768649 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3001bebe-2cbe-4356-ae3e-f13cd66e870c-bundle\") pod \"3001bebe-2cbe-4356-ae3e-f13cd66e870c\" (UID: \"3001bebe-2cbe-4356-ae3e-f13cd66e870c\") "
Apr 16 17:50:23.768854 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:50:23.768688 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3001bebe-2cbe-4356-ae3e-f13cd66e870c-util\") pod \"3001bebe-2cbe-4356-ae3e-f13cd66e870c\" (UID: \"3001bebe-2cbe-4356-ae3e-f13cd66e870c\") "
Apr 16 17:50:23.769439 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:50:23.769362 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3001bebe-2cbe-4356-ae3e-f13cd66e870c-bundle" (OuterVolumeSpecName: "bundle") pod "3001bebe-2cbe-4356-ae3e-f13cd66e870c" (UID: "3001bebe-2cbe-4356-ae3e-f13cd66e870c"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 17:50:23.770889 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:50:23.770863 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3001bebe-2cbe-4356-ae3e-f13cd66e870c-kube-api-access-gqk79" (OuterVolumeSpecName: "kube-api-access-gqk79") pod "3001bebe-2cbe-4356-ae3e-f13cd66e870c" (UID: "3001bebe-2cbe-4356-ae3e-f13cd66e870c"). InnerVolumeSpecName "kube-api-access-gqk79". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 17:50:23.774354 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:50:23.774329 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3001bebe-2cbe-4356-ae3e-f13cd66e870c-util" (OuterVolumeSpecName: "util") pod "3001bebe-2cbe-4356-ae3e-f13cd66e870c" (UID: "3001bebe-2cbe-4356-ae3e-f13cd66e870c"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 17:50:23.869630 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:50:23.869590 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gqk79\" (UniqueName: \"kubernetes.io/projected/3001bebe-2cbe-4356-ae3e-f13cd66e870c-kube-api-access-gqk79\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\""
Apr 16 17:50:23.869630 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:50:23.869622 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3001bebe-2cbe-4356-ae3e-f13cd66e870c-bundle\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\""
Apr 16 17:50:23.869630 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:50:23.869633 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3001bebe-2cbe-4356-ae3e-f13cd66e870c-util\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\""
Apr 16 17:50:24.524076 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:50:24.524035 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hdzm8" event={"ID":"3001bebe-2cbe-4356-ae3e-f13cd66e870c","Type":"ContainerDied","Data":"9e1fcc0d576eb8952ebaef48d9db66d98f00357b32c8ed66f744a873e28cbda1"}
Apr 16 17:50:24.524076 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:50:24.524075 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9e1fcc0d576eb8952ebaef48d9db66d98f00357b32c8ed66f744a873e28cbda1"
Apr 16 17:50:24.524076 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:50:24.524083 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hdzm8"
Apr 16 17:50:31.371469 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:50:31.371433 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-8jlnj"]
Apr 16 17:50:31.371973 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:50:31.371710 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3001bebe-2cbe-4356-ae3e-f13cd66e870c" containerName="util"
Apr 16 17:50:31.371973 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:50:31.371720 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="3001bebe-2cbe-4356-ae3e-f13cd66e870c" containerName="util"
Apr 16 17:50:31.371973 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:50:31.371749 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3001bebe-2cbe-4356-ae3e-f13cd66e870c" containerName="pull"
Apr 16 17:50:31.371973 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:50:31.371755 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="3001bebe-2cbe-4356-ae3e-f13cd66e870c" containerName="pull"
Apr 16 17:50:31.371973 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:50:31.371766 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container"
podUID="3001bebe-2cbe-4356-ae3e-f13cd66e870c" containerName="extract"
Apr 16 17:50:31.371973 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:50:31.371773 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="3001bebe-2cbe-4356-ae3e-f13cd66e870c" containerName="extract"
Apr 16 17:50:31.371973 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:50:31.371822 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="3001bebe-2cbe-4356-ae3e-f13cd66e870c" containerName="extract"
Apr 16 17:50:31.374514 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:50:31.374497 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-8jlnj"
Apr 16 17:50:31.378807 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:50:31.378777 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\""
Apr 16 17:50:31.378807 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:50:31.378801 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-hszvg\""
Apr 16 17:50:31.379011 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:50:31.378864 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\""
Apr 16 17:50:31.393644 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:50:31.393619 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-8jlnj"]
Apr 16 17:50:31.427889 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:50:31.427849 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfflq\" (UniqueName: \"kubernetes.io/projected/27749153-8c42-4620-9ca0-0f29ded47fe8-kube-api-access-jfflq\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-8jlnj\" (UID: \"27749153-8c42-4620-9ca0-0f29ded47fe8\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-8jlnj"
Apr 16 17:50:31.428079 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:50:31.427903 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/27749153-8c42-4620-9ca0-0f29ded47fe8-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-8jlnj\" (UID: \"27749153-8c42-4620-9ca0-0f29ded47fe8\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-8jlnj"
Apr 16 17:50:31.529228 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:50:31.529192 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jfflq\" (UniqueName: \"kubernetes.io/projected/27749153-8c42-4620-9ca0-0f29ded47fe8-kube-api-access-jfflq\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-8jlnj\" (UID: \"27749153-8c42-4620-9ca0-0f29ded47fe8\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-8jlnj"
Apr 16 17:50:31.529356 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:50:31.529235 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/27749153-8c42-4620-9ca0-0f29ded47fe8-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-8jlnj\" (UID: \"27749153-8c42-4620-9ca0-0f29ded47fe8\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-8jlnj"
Apr 16 17:50:31.529599 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:50:31.529584 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/27749153-8c42-4620-9ca0-0f29ded47fe8-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-8jlnj\" (UID: \"27749153-8c42-4620-9ca0-0f29ded47fe8\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-8jlnj"
Apr 16 17:50:31.539364 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:50:31.539336 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfflq\" (UniqueName: \"kubernetes.io/projected/27749153-8c42-4620-9ca0-0f29ded47fe8-kube-api-access-jfflq\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-8jlnj\" (UID: \"27749153-8c42-4620-9ca0-0f29ded47fe8\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-8jlnj"
Apr 16 17:50:31.683567 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:50:31.683538 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-8jlnj"
Apr 16 17:50:31.821493 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:50:31.821469 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-8jlnj"]
Apr 16 17:50:31.823173 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:50:31.823143 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod27749153_8c42_4620_9ca0_0f29ded47fe8.slice/crio-a642071e507046ead7f3021c3c1d60bb4b1aff8af89ea539ac77f6bd344ba218 WatchSource:0}: Error finding container a642071e507046ead7f3021c3c1d60bb4b1aff8af89ea539ac77f6bd344ba218: Status 404 returned error can't find the container with id a642071e507046ead7f3021c3c1d60bb4b1aff8af89ea539ac77f6bd344ba218
Apr 16 17:50:32.555922 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:50:32.555875 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-8jlnj" event={"ID":"27749153-8c42-4620-9ca0-0f29ded47fe8","Type":"ContainerStarted","Data":"a642071e507046ead7f3021c3c1d60bb4b1aff8af89ea539ac77f6bd344ba218"}
Apr 16 17:50:34.564174 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:50:34.564135 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-8jlnj" event={"ID":"27749153-8c42-4620-9ca0-0f29ded47fe8","Type":"ContainerStarted","Data":"b5137ae41512fcee268faf803d50a99d90940eddeb6ff78ca4c3810af617e6e1"}
Apr 16 17:50:34.590322 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:50:34.590269 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-8jlnj" podStartSLOduration=1.7996054259999998 podStartE2EDuration="3.590253614s" podCreationTimestamp="2026-04-16 17:50:31 +0000 UTC" firstStartedPulling="2026-04-16 17:50:31.825820962 +0000 UTC m=+629.842288470" lastFinishedPulling="2026-04-16 17:50:33.616469156 +0000 UTC m=+631.632936658" observedRunningTime="2026-04-16 17:50:34.588249321 +0000 UTC m=+632.604716841" watchObservedRunningTime="2026-04-16 17:50:34.590253614 +0000 UTC m=+632.606721180"
Apr 16 17:50:41.786179 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:50:41.786140 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftjl7x"]
Apr 16 17:50:41.789783 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:50:41.789760 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftjl7x"
Apr 16 17:50:41.799961 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:50:41.799938 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 16 17:50:41.800088 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:50:41.799964 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-l58bq\""
Apr 16 17:50:41.800088 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:50:41.800030 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 16 17:50:41.803049 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:50:41.803028 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftjl7x"]
Apr 16 17:50:41.806811 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:50:41.806789 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwnwn\" (UniqueName: \"kubernetes.io/projected/06dc6323-524b-4ca1-938c-d5ab00bff10d-kube-api-access-xwnwn\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftjl7x\" (UID: \"06dc6323-524b-4ca1-938c-d5ab00bff10d\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftjl7x"
Apr 16 17:50:41.806921 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:50:41.806847 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/06dc6323-524b-4ca1-938c-d5ab00bff10d-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftjl7x\" (UID: \"06dc6323-524b-4ca1-938c-d5ab00bff10d\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftjl7x"
Apr 16 17:50:41.806988 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:50:41.806966 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/06dc6323-524b-4ca1-938c-d5ab00bff10d-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftjl7x\" (UID: \"06dc6323-524b-4ca1-938c-d5ab00bff10d\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftjl7x"
Apr 16 17:50:41.907687 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:50:41.907649 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/06dc6323-524b-4ca1-938c-d5ab00bff10d-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftjl7x\" (UID: \"06dc6323-524b-4ca1-938c-d5ab00bff10d\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftjl7x"
Apr 16 17:50:41.907888 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:50:41.907710 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/06dc6323-524b-4ca1-938c-d5ab00bff10d-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftjl7x\" (UID: \"06dc6323-524b-4ca1-938c-d5ab00bff10d\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftjl7x"
Apr 16 17:50:41.907888 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:50:41.907736 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xwnwn\" (UniqueName: \"kubernetes.io/projected/06dc6323-524b-4ca1-938c-d5ab00bff10d-kube-api-access-xwnwn\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftjl7x\" (UID: \"06dc6323-524b-4ca1-938c-d5ab00bff10d\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftjl7x"
Apr 16 17:50:41.908067 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:50:41.908050 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/06dc6323-524b-4ca1-938c-d5ab00bff10d-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftjl7x\" (UID: \"06dc6323-524b-4ca1-938c-d5ab00bff10d\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftjl7x"
Apr 16 17:50:41.908108 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:50:41.908086 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/06dc6323-524b-4ca1-938c-d5ab00bff10d-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftjl7x\" (UID: \"06dc6323-524b-4ca1-938c-d5ab00bff10d\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftjl7x"
Apr 16 17:50:41.918548 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:50:41.918521 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwnwn\" (UniqueName: \"kubernetes.io/projected/06dc6323-524b-4ca1-938c-d5ab00bff10d-kube-api-access-xwnwn\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftjl7x\" (UID: \"06dc6323-524b-4ca1-938c-d5ab00bff10d\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftjl7x"
Apr 16 17:50:42.098897 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:50:42.098807 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftjl7x"
Apr 16 17:50:42.225119 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:50:42.225090 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftjl7x"]
Apr 16 17:50:42.227555 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:50:42.227516 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06dc6323_524b_4ca1_938c_d5ab00bff10d.slice/crio-5310676a42520b1c700f264134568460010ef3a3454706ded9fff7fcedc1d55a WatchSource:0}: Error finding container 5310676a42520b1c700f264134568460010ef3a3454706ded9fff7fcedc1d55a: Status 404 returned error can't find the container with id 5310676a42520b1c700f264134568460010ef3a3454706ded9fff7fcedc1d55a
Apr 16 17:50:42.589829 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:50:42.589790 2576 generic.go:358] "Generic (PLEG): container finished" podID="06dc6323-524b-4ca1-938c-d5ab00bff10d" containerID="031a296160c21c687435f249f46a2d05a77cedb8ebe8f6f8eacf36c985d61bb2" exitCode=0
Apr 16 17:50:42.590049 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:50:42.589879 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftjl7x" event={"ID":"06dc6323-524b-4ca1-938c-d5ab00bff10d","Type":"ContainerDied","Data":"031a296160c21c687435f249f46a2d05a77cedb8ebe8f6f8eacf36c985d61bb2"}
Apr 16 17:50:42.590049 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:50:42.589935 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftjl7x" event={"ID":"06dc6323-524b-4ca1-938c-d5ab00bff10d","Type":"ContainerStarted","Data":"5310676a42520b1c700f264134568460010ef3a3454706ded9fff7fcedc1d55a"}
Apr 16 17:50:45.601354 ip-10-0-141-32 kubenswrapper[2576]: I0416
17:50:45.601307 2576 generic.go:358] "Generic (PLEG): container finished" podID="06dc6323-524b-4ca1-938c-d5ab00bff10d" containerID="c3fb8dbf143785b52c0e7cd3d5a2c2c0781e358089645136411739250ba432fe" exitCode=0 Apr 16 17:50:45.601859 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:50:45.601409 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftjl7x" event={"ID":"06dc6323-524b-4ca1-938c-d5ab00bff10d","Type":"ContainerDied","Data":"c3fb8dbf143785b52c0e7cd3d5a2c2c0781e358089645136411739250ba432fe"} Apr 16 17:50:46.606141 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:50:46.606107 2576 generic.go:358] "Generic (PLEG): container finished" podID="06dc6323-524b-4ca1-938c-d5ab00bff10d" containerID="0055dbc940fc329cf909db95e73c6f6f7a72a658f6384d384aaf4353c87c9c86" exitCode=0 Apr 16 17:50:46.606518 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:50:46.606183 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftjl7x" event={"ID":"06dc6323-524b-4ca1-938c-d5ab00bff10d","Type":"ContainerDied","Data":"0055dbc940fc329cf909db95e73c6f6f7a72a658f6384d384aaf4353c87c9c86"} Apr 16 17:50:47.728652 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:50:47.728624 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftjl7x" Apr 16 17:50:47.759551 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:50:47.759516 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/06dc6323-524b-4ca1-938c-d5ab00bff10d-bundle\") pod \"06dc6323-524b-4ca1-938c-d5ab00bff10d\" (UID: \"06dc6323-524b-4ca1-938c-d5ab00bff10d\") " Apr 16 17:50:47.759718 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:50:47.759587 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/06dc6323-524b-4ca1-938c-d5ab00bff10d-util\") pod \"06dc6323-524b-4ca1-938c-d5ab00bff10d\" (UID: \"06dc6323-524b-4ca1-938c-d5ab00bff10d\") " Apr 16 17:50:47.759718 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:50:47.759675 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwnwn\" (UniqueName: \"kubernetes.io/projected/06dc6323-524b-4ca1-938c-d5ab00bff10d-kube-api-access-xwnwn\") pod \"06dc6323-524b-4ca1-938c-d5ab00bff10d\" (UID: \"06dc6323-524b-4ca1-938c-d5ab00bff10d\") " Apr 16 17:50:47.759924 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:50:47.759886 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/06dc6323-524b-4ca1-938c-d5ab00bff10d-bundle" (OuterVolumeSpecName: "bundle") pod "06dc6323-524b-4ca1-938c-d5ab00bff10d" (UID: "06dc6323-524b-4ca1-938c-d5ab00bff10d"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:50:47.761882 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:50:47.761853 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06dc6323-524b-4ca1-938c-d5ab00bff10d-kube-api-access-xwnwn" (OuterVolumeSpecName: "kube-api-access-xwnwn") pod "06dc6323-524b-4ca1-938c-d5ab00bff10d" (UID: "06dc6323-524b-4ca1-938c-d5ab00bff10d"). InnerVolumeSpecName "kube-api-access-xwnwn". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 17:50:47.765377 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:50:47.765349 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/06dc6323-524b-4ca1-938c-d5ab00bff10d-util" (OuterVolumeSpecName: "util") pod "06dc6323-524b-4ca1-938c-d5ab00bff10d" (UID: "06dc6323-524b-4ca1-938c-d5ab00bff10d"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:50:47.860828 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:50:47.860729 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xwnwn\" (UniqueName: \"kubernetes.io/projected/06dc6323-524b-4ca1-938c-d5ab00bff10d-kube-api-access-xwnwn\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\"" Apr 16 17:50:47.860828 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:50:47.860775 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/06dc6323-524b-4ca1-938c-d5ab00bff10d-bundle\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\"" Apr 16 17:50:47.860828 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:50:47.860786 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/06dc6323-524b-4ca1-938c-d5ab00bff10d-util\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\"" Apr 16 17:50:48.613882 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:50:48.613853 2576 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftjl7x" Apr 16 17:50:48.614065 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:50:48.613844 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftjl7x" event={"ID":"06dc6323-524b-4ca1-938c-d5ab00bff10d","Type":"ContainerDied","Data":"5310676a42520b1c700f264134568460010ef3a3454706ded9fff7fcedc1d55a"} Apr 16 17:50:48.614065 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:50:48.613946 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5310676a42520b1c700f264134568460010ef3a3454706ded9fff7fcedc1d55a" Apr 16 17:51:09.414273 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:09.414236 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48354t6gv"] Apr 16 17:51:09.414872 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:09.414542 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="06dc6323-524b-4ca1-938c-d5ab00bff10d" containerName="pull" Apr 16 17:51:09.414872 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:09.414553 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="06dc6323-524b-4ca1-938c-d5ab00bff10d" containerName="pull" Apr 16 17:51:09.414872 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:09.414563 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="06dc6323-524b-4ca1-938c-d5ab00bff10d" containerName="util" Apr 16 17:51:09.414872 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:09.414568 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="06dc6323-524b-4ca1-938c-d5ab00bff10d" containerName="util" Apr 16 17:51:09.414872 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:09.414585 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="06dc6323-524b-4ca1-938c-d5ab00bff10d" containerName="extract" Apr 16 17:51:09.414872 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:09.414594 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="06dc6323-524b-4ca1-938c-d5ab00bff10d" containerName="extract" Apr 16 17:51:09.414872 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:09.414650 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="06dc6323-524b-4ca1-938c-d5ab00bff10d" containerName="extract" Apr 16 17:51:09.421266 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:09.421246 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48354t6gv" Apr 16 17:51:09.424197 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:09.424176 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-l58bq\"" Apr 16 17:51:09.424309 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:09.424176 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 17:51:09.424309 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:09.424177 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 17:51:09.427567 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:09.427543 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48354t6gv"] Apr 16 17:51:09.531623 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:09.531581 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hvml\" (UniqueName: \"kubernetes.io/projected/3303daaf-8764-4a31-8ac0-2c8aad25e000-kube-api-access-7hvml\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48354t6gv\" (UID: 
\"3303daaf-8764-4a31-8ac0-2c8aad25e000\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48354t6gv" Apr 16 17:51:09.531798 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:09.531666 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3303daaf-8764-4a31-8ac0-2c8aad25e000-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48354t6gv\" (UID: \"3303daaf-8764-4a31-8ac0-2c8aad25e000\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48354t6gv" Apr 16 17:51:09.531798 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:09.531705 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3303daaf-8764-4a31-8ac0-2c8aad25e000-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48354t6gv\" (UID: \"3303daaf-8764-4a31-8ac0-2c8aad25e000\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48354t6gv" Apr 16 17:51:09.632664 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:09.632619 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3303daaf-8764-4a31-8ac0-2c8aad25e000-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48354t6gv\" (UID: \"3303daaf-8764-4a31-8ac0-2c8aad25e000\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48354t6gv" Apr 16 17:51:09.632895 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:09.632687 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3303daaf-8764-4a31-8ac0-2c8aad25e000-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48354t6gv\" (UID: \"3303daaf-8764-4a31-8ac0-2c8aad25e000\") " 
pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48354t6gv" Apr 16 17:51:09.632895 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:09.632709 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7hvml\" (UniqueName: \"kubernetes.io/projected/3303daaf-8764-4a31-8ac0-2c8aad25e000-kube-api-access-7hvml\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48354t6gv\" (UID: \"3303daaf-8764-4a31-8ac0-2c8aad25e000\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48354t6gv" Apr 16 17:51:09.633106 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:09.633008 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3303daaf-8764-4a31-8ac0-2c8aad25e000-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48354t6gv\" (UID: \"3303daaf-8764-4a31-8ac0-2c8aad25e000\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48354t6gv" Apr 16 17:51:09.633106 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:09.633028 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3303daaf-8764-4a31-8ac0-2c8aad25e000-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48354t6gv\" (UID: \"3303daaf-8764-4a31-8ac0-2c8aad25e000\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48354t6gv" Apr 16 17:51:09.642847 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:09.642818 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hvml\" (UniqueName: \"kubernetes.io/projected/3303daaf-8764-4a31-8ac0-2c8aad25e000-kube-api-access-7hvml\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48354t6gv\" (UID: \"3303daaf-8764-4a31-8ac0-2c8aad25e000\") " 
pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48354t6gv" Apr 16 17:51:09.730862 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:09.730768 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48354t6gv" Apr 16 17:51:09.863211 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:09.863184 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48354t6gv"] Apr 16 17:51:09.865606 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:51:09.865577 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3303daaf_8764_4a31_8ac0_2c8aad25e000.slice/crio-cde3535899a65b979a615e3bc21499150c4d5e9b2a89b6166ae1ddbd752f1416 WatchSource:0}: Error finding container cde3535899a65b979a615e3bc21499150c4d5e9b2a89b6166ae1ddbd752f1416: Status 404 returned error can't find the container with id cde3535899a65b979a615e3bc21499150c4d5e9b2a89b6166ae1ddbd752f1416 Apr 16 17:51:10.687441 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:10.687409 2576 generic.go:358] "Generic (PLEG): container finished" podID="3303daaf-8764-4a31-8ac0-2c8aad25e000" containerID="f0e6af8ecc204a88c1ea0e1e056e5b9ade77f0013a563b4e53377d2107666827" exitCode=0 Apr 16 17:51:10.687831 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:10.687467 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48354t6gv" event={"ID":"3303daaf-8764-4a31-8ac0-2c8aad25e000","Type":"ContainerDied","Data":"f0e6af8ecc204a88c1ea0e1e056e5b9ade77f0013a563b4e53377d2107666827"} Apr 16 17:51:10.687831 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:10.687488 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48354t6gv" event={"ID":"3303daaf-8764-4a31-8ac0-2c8aad25e000","Type":"ContainerStarted","Data":"cde3535899a65b979a615e3bc21499150c4d5e9b2a89b6166ae1ddbd752f1416"} Apr 16 17:51:11.692074 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:11.692040 2576 generic.go:358] "Generic (PLEG): container finished" podID="3303daaf-8764-4a31-8ac0-2c8aad25e000" containerID="bd6f088ed877cca3a79303674092259d10a463da716aa3f20acf030423c96447" exitCode=0 Apr 16 17:51:11.692469 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:11.692124 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48354t6gv" event={"ID":"3303daaf-8764-4a31-8ac0-2c8aad25e000","Type":"ContainerDied","Data":"bd6f088ed877cca3a79303674092259d10a463da716aa3f20acf030423c96447"} Apr 16 17:51:12.697277 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:12.697245 2576 generic.go:358] "Generic (PLEG): container finished" podID="3303daaf-8764-4a31-8ac0-2c8aad25e000" containerID="d4184faa25c1379859c9ca11e8d929d9c2c891844e5200f8a428b307a373d8bd" exitCode=0 Apr 16 17:51:12.697663 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:12.697326 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48354t6gv" event={"ID":"3303daaf-8764-4a31-8ac0-2c8aad25e000","Type":"ContainerDied","Data":"d4184faa25c1379859c9ca11e8d929d9c2c891844e5200f8a428b307a373d8bd"} Apr 16 17:51:13.820593 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:13.820570 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48354t6gv" Apr 16 17:51:13.965831 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:13.965743 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7hvml\" (UniqueName: \"kubernetes.io/projected/3303daaf-8764-4a31-8ac0-2c8aad25e000-kube-api-access-7hvml\") pod \"3303daaf-8764-4a31-8ac0-2c8aad25e000\" (UID: \"3303daaf-8764-4a31-8ac0-2c8aad25e000\") " Apr 16 17:51:13.965831 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:13.965815 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3303daaf-8764-4a31-8ac0-2c8aad25e000-bundle\") pod \"3303daaf-8764-4a31-8ac0-2c8aad25e000\" (UID: \"3303daaf-8764-4a31-8ac0-2c8aad25e000\") " Apr 16 17:51:13.966054 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:13.965873 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3303daaf-8764-4a31-8ac0-2c8aad25e000-util\") pod \"3303daaf-8764-4a31-8ac0-2c8aad25e000\" (UID: \"3303daaf-8764-4a31-8ac0-2c8aad25e000\") " Apr 16 17:51:13.966663 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:13.966636 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3303daaf-8764-4a31-8ac0-2c8aad25e000-bundle" (OuterVolumeSpecName: "bundle") pod "3303daaf-8764-4a31-8ac0-2c8aad25e000" (UID: "3303daaf-8764-4a31-8ac0-2c8aad25e000"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:51:13.967842 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:13.967813 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3303daaf-8764-4a31-8ac0-2c8aad25e000-kube-api-access-7hvml" (OuterVolumeSpecName: "kube-api-access-7hvml") pod "3303daaf-8764-4a31-8ac0-2c8aad25e000" (UID: "3303daaf-8764-4a31-8ac0-2c8aad25e000"). InnerVolumeSpecName "kube-api-access-7hvml". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 17:51:13.971527 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:13.971500 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3303daaf-8764-4a31-8ac0-2c8aad25e000-util" (OuterVolumeSpecName: "util") pod "3303daaf-8764-4a31-8ac0-2c8aad25e000" (UID: "3303daaf-8764-4a31-8ac0-2c8aad25e000"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:51:14.066677 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:14.066642 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3303daaf-8764-4a31-8ac0-2c8aad25e000-bundle\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\"" Apr 16 17:51:14.066677 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:14.066673 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3303daaf-8764-4a31-8ac0-2c8aad25e000-util\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\"" Apr 16 17:51:14.066677 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:14.066684 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7hvml\" (UniqueName: \"kubernetes.io/projected/3303daaf-8764-4a31-8ac0-2c8aad25e000-kube-api-access-7hvml\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\"" Apr 16 17:51:14.705723 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:14.705688 2576 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48354t6gv" event={"ID":"3303daaf-8764-4a31-8ac0-2c8aad25e000","Type":"ContainerDied","Data":"cde3535899a65b979a615e3bc21499150c4d5e9b2a89b6166ae1ddbd752f1416"} Apr 16 17:51:14.705723 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:14.705726 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cde3535899a65b979a615e3bc21499150c4d5e9b2a89b6166ae1ddbd752f1416" Apr 16 17:51:14.705955 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:14.705738 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48354t6gv" Apr 16 17:51:23.980358 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:23.980321 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2xkqgv"] Apr 16 17:51:23.981433 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:23.981383 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3303daaf-8764-4a31-8ac0-2c8aad25e000" containerName="util" Apr 16 17:51:23.981538 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:23.981436 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="3303daaf-8764-4a31-8ac0-2c8aad25e000" containerName="util" Apr 16 17:51:23.981538 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:23.981473 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3303daaf-8764-4a31-8ac0-2c8aad25e000" containerName="extract" Apr 16 17:51:23.981538 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:23.981483 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="3303daaf-8764-4a31-8ac0-2c8aad25e000" containerName="extract" Apr 16 17:51:23.981538 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:23.981496 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="3303daaf-8764-4a31-8ac0-2c8aad25e000" containerName="pull" Apr 16 17:51:23.981538 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:23.981506 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="3303daaf-8764-4a31-8ac0-2c8aad25e000" containerName="pull" Apr 16 17:51:23.981773 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:23.981664 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="3303daaf-8764-4a31-8ac0-2c8aad25e000" containerName="extract" Apr 16 17:51:23.987021 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:23.986990 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2xkqgv" Apr 16 17:51:23.993786 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:23.993767 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 17:51:23.994877 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:23.994858 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 17:51:23.995700 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:23.995682 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-l58bq\"" Apr 16 17:51:24.019369 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:24.019340 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2xkqgv"] Apr 16 17:51:24.152424 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:24.152390 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c3d09783-174c-432a-a8a4-fc480c2d6bb3-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2xkqgv\" (UID: \"c3d09783-174c-432a-a8a4-fc480c2d6bb3\") " 
pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2xkqgv" Apr 16 17:51:24.152424 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:24.152429 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvm44\" (UniqueName: \"kubernetes.io/projected/c3d09783-174c-432a-a8a4-fc480c2d6bb3-kube-api-access-nvm44\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2xkqgv\" (UID: \"c3d09783-174c-432a-a8a4-fc480c2d6bb3\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2xkqgv" Apr 16 17:51:24.152694 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:24.152564 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c3d09783-174c-432a-a8a4-fc480c2d6bb3-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2xkqgv\" (UID: \"c3d09783-174c-432a-a8a4-fc480c2d6bb3\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2xkqgv" Apr 16 17:51:24.253304 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:24.253218 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c3d09783-174c-432a-a8a4-fc480c2d6bb3-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2xkqgv\" (UID: \"c3d09783-174c-432a-a8a4-fc480c2d6bb3\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2xkqgv" Apr 16 17:51:24.253304 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:24.253269 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nvm44\" (UniqueName: \"kubernetes.io/projected/c3d09783-174c-432a-a8a4-fc480c2d6bb3-kube-api-access-nvm44\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2xkqgv\" (UID: \"c3d09783-174c-432a-a8a4-fc480c2d6bb3\") " 
pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2xkqgv" Apr 16 17:51:24.253545 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:24.253362 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c3d09783-174c-432a-a8a4-fc480c2d6bb3-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2xkqgv\" (UID: \"c3d09783-174c-432a-a8a4-fc480c2d6bb3\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2xkqgv" Apr 16 17:51:24.253633 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:24.253614 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c3d09783-174c-432a-a8a4-fc480c2d6bb3-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2xkqgv\" (UID: \"c3d09783-174c-432a-a8a4-fc480c2d6bb3\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2xkqgv" Apr 16 17:51:24.253694 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:24.253678 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c3d09783-174c-432a-a8a4-fc480c2d6bb3-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2xkqgv\" (UID: \"c3d09783-174c-432a-a8a4-fc480c2d6bb3\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2xkqgv" Apr 16 17:51:24.268861 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:24.268817 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvm44\" (UniqueName: \"kubernetes.io/projected/c3d09783-174c-432a-a8a4-fc480c2d6bb3-kube-api-access-nvm44\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2xkqgv\" (UID: \"c3d09783-174c-432a-a8a4-fc480c2d6bb3\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2xkqgv" Apr 16 
17:51:24.296429 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:24.296392 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2xkqgv"
Apr 16 17:51:24.424181 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:24.424144 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2xkqgv"]
Apr 16 17:51:24.428060 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:51:24.428025 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3d09783_174c_432a_a8a4_fc480c2d6bb3.slice/crio-15b42462335625a0dbc23ae0f308ccc3faf6b5d7c7da1956bdd9425266e80d4e WatchSource:0}: Error finding container 15b42462335625a0dbc23ae0f308ccc3faf6b5d7c7da1956bdd9425266e80d4e: Status 404 returned error can't find the container with id 15b42462335625a0dbc23ae0f308ccc3faf6b5d7c7da1956bdd9425266e80d4e
Apr 16 17:51:24.738702 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:24.738670 2576 generic.go:358] "Generic (PLEG): container finished" podID="c3d09783-174c-432a-a8a4-fc480c2d6bb3" containerID="4428f5234aadf2798759a4830a65b13e760ee6121cb2408fb301945dc9ba79e5" exitCode=0
Apr 16 17:51:24.738881 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:24.738728 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2xkqgv" event={"ID":"c3d09783-174c-432a-a8a4-fc480c2d6bb3","Type":"ContainerDied","Data":"4428f5234aadf2798759a4830a65b13e760ee6121cb2408fb301945dc9ba79e5"}
Apr 16 17:51:24.738881 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:24.738754 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2xkqgv" event={"ID":"c3d09783-174c-432a-a8a4-fc480c2d6bb3","Type":"ContainerStarted","Data":"15b42462335625a0dbc23ae0f308ccc3faf6b5d7c7da1956bdd9425266e80d4e"}
Apr 16 17:51:26.243668 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:26.243634 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-m8vtj"]
Apr 16 17:51:26.246720 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:26.246704 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-m8vtj"
Apr 16 17:51:26.250448 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:26.250430 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"kube-root-ca.crt\""
Apr 16 17:51:26.250965 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:26.250948 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"openshift-service-ca.crt\""
Apr 16 17:51:26.251697 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:26.251680 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"servicemesh-operator3-dockercfg-fbbhx\""
Apr 16 17:51:26.257947 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:26.257903 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-m8vtj"]
Apr 16 17:51:26.371648 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:26.371613 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rnhp\" (UniqueName: \"kubernetes.io/projected/85beff7e-b12a-48aa-a9a7-eb7bb35a110d-kube-api-access-5rnhp\") pod \"servicemesh-operator3-55f49c5f94-m8vtj\" (UID: \"85beff7e-b12a-48aa-a9a7-eb7bb35a110d\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-m8vtj"
Apr 16 17:51:26.371648 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:26.371662 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/85beff7e-b12a-48aa-a9a7-eb7bb35a110d-operator-config\") pod \"servicemesh-operator3-55f49c5f94-m8vtj\" (UID: \"85beff7e-b12a-48aa-a9a7-eb7bb35a110d\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-m8vtj"
Apr 16 17:51:26.472922 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:26.472879 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5rnhp\" (UniqueName: \"kubernetes.io/projected/85beff7e-b12a-48aa-a9a7-eb7bb35a110d-kube-api-access-5rnhp\") pod \"servicemesh-operator3-55f49c5f94-m8vtj\" (UID: \"85beff7e-b12a-48aa-a9a7-eb7bb35a110d\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-m8vtj"
Apr 16 17:51:26.473088 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:26.472958 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/85beff7e-b12a-48aa-a9a7-eb7bb35a110d-operator-config\") pod \"servicemesh-operator3-55f49c5f94-m8vtj\" (UID: \"85beff7e-b12a-48aa-a9a7-eb7bb35a110d\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-m8vtj"
Apr 16 17:51:26.475420 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:26.475400 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/85beff7e-b12a-48aa-a9a7-eb7bb35a110d-operator-config\") pod \"servicemesh-operator3-55f49c5f94-m8vtj\" (UID: \"85beff7e-b12a-48aa-a9a7-eb7bb35a110d\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-m8vtj"
Apr 16 17:51:26.489285 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:26.489250 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rnhp\" (UniqueName: \"kubernetes.io/projected/85beff7e-b12a-48aa-a9a7-eb7bb35a110d-kube-api-access-5rnhp\") pod \"servicemesh-operator3-55f49c5f94-m8vtj\" (UID: \"85beff7e-b12a-48aa-a9a7-eb7bb35a110d\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-m8vtj"
Apr 16 17:51:26.555383 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:26.555282 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-m8vtj"
Apr 16 17:51:26.706004 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:51:26.705962 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod85beff7e_b12a_48aa_a9a7_eb7bb35a110d.slice/crio-abbd278b813672d12047d729e3f7e3205185b7ad8c282334f9a5d10c35fc1fa3 WatchSource:0}: Error finding container abbd278b813672d12047d729e3f7e3205185b7ad8c282334f9a5d10c35fc1fa3: Status 404 returned error can't find the container with id abbd278b813672d12047d729e3f7e3205185b7ad8c282334f9a5d10c35fc1fa3
Apr 16 17:51:26.708158 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:26.708129 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-m8vtj"]
Apr 16 17:51:26.746320 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:26.746281 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-m8vtj" event={"ID":"85beff7e-b12a-48aa-a9a7-eb7bb35a110d","Type":"ContainerStarted","Data":"abbd278b813672d12047d729e3f7e3205185b7ad8c282334f9a5d10c35fc1fa3"}
Apr 16 17:51:26.747831 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:26.747806 2576 generic.go:358] "Generic (PLEG): container finished" podID="c3d09783-174c-432a-a8a4-fc480c2d6bb3" containerID="10bd9189a6e7672f87bac51dfcd175152cac94f6e94bbd9e6a344e2a7dd0eea9" exitCode=0
Apr 16 17:51:26.747990 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:26.747848 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2xkqgv" event={"ID":"c3d09783-174c-432a-a8a4-fc480c2d6bb3","Type":"ContainerDied","Data":"10bd9189a6e7672f87bac51dfcd175152cac94f6e94bbd9e6a344e2a7dd0eea9"}
Apr 16 17:51:27.753382 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:27.753344 2576 generic.go:358] "Generic (PLEG): container finished" podID="c3d09783-174c-432a-a8a4-fc480c2d6bb3" containerID="40a453088aa7afbeb62fedb67252bb95a079cca2307c8f41ef2177cca08bfbfa" exitCode=0
Apr 16 17:51:27.753815 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:27.753445 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2xkqgv" event={"ID":"c3d09783-174c-432a-a8a4-fc480c2d6bb3","Type":"ContainerDied","Data":"40a453088aa7afbeb62fedb67252bb95a079cca2307c8f41ef2177cca08bfbfa"}
Apr 16 17:51:28.987153 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:28.987124 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2xkqgv"
Apr 16 17:51:29.096991 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:29.096962 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c3d09783-174c-432a-a8a4-fc480c2d6bb3-bundle\") pod \"c3d09783-174c-432a-a8a4-fc480c2d6bb3\" (UID: \"c3d09783-174c-432a-a8a4-fc480c2d6bb3\") "
Apr 16 17:51:29.097125 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:29.097031 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nvm44\" (UniqueName: \"kubernetes.io/projected/c3d09783-174c-432a-a8a4-fc480c2d6bb3-kube-api-access-nvm44\") pod \"c3d09783-174c-432a-a8a4-fc480c2d6bb3\" (UID: \"c3d09783-174c-432a-a8a4-fc480c2d6bb3\") "
Apr 16 17:51:29.097183 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:29.097132 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c3d09783-174c-432a-a8a4-fc480c2d6bb3-util\") pod \"c3d09783-174c-432a-a8a4-fc480c2d6bb3\" (UID: \"c3d09783-174c-432a-a8a4-fc480c2d6bb3\") "
Apr 16 17:51:29.098182 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:29.098154 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3d09783-174c-432a-a8a4-fc480c2d6bb3-bundle" (OuterVolumeSpecName: "bundle") pod "c3d09783-174c-432a-a8a4-fc480c2d6bb3" (UID: "c3d09783-174c-432a-a8a4-fc480c2d6bb3"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 17:51:29.099177 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:29.099152 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3d09783-174c-432a-a8a4-fc480c2d6bb3-kube-api-access-nvm44" (OuterVolumeSpecName: "kube-api-access-nvm44") pod "c3d09783-174c-432a-a8a4-fc480c2d6bb3" (UID: "c3d09783-174c-432a-a8a4-fc480c2d6bb3"). InnerVolumeSpecName "kube-api-access-nvm44". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 17:51:29.105291 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:29.105257 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3d09783-174c-432a-a8a4-fc480c2d6bb3-util" (OuterVolumeSpecName: "util") pod "c3d09783-174c-432a-a8a4-fc480c2d6bb3" (UID: "c3d09783-174c-432a-a8a4-fc480c2d6bb3"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 17:51:29.198595 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:29.198538 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nvm44\" (UniqueName: \"kubernetes.io/projected/c3d09783-174c-432a-a8a4-fc480c2d6bb3-kube-api-access-nvm44\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\""
Apr 16 17:51:29.198595 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:29.198591 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c3d09783-174c-432a-a8a4-fc480c2d6bb3-util\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\""
Apr 16 17:51:29.198595 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:29.198605 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c3d09783-174c-432a-a8a4-fc480c2d6bb3-bundle\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\""
Apr 16 17:51:29.762273 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:29.762239 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2xkqgv" event={"ID":"c3d09783-174c-432a-a8a4-fc480c2d6bb3","Type":"ContainerDied","Data":"15b42462335625a0dbc23ae0f308ccc3faf6b5d7c7da1956bdd9425266e80d4e"}
Apr 16 17:51:29.762273 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:29.762272 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="15b42462335625a0dbc23ae0f308ccc3faf6b5d7c7da1956bdd9425266e80d4e"
Apr 16 17:51:29.762273 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:29.762272 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2xkqgv"
Apr 16 17:51:29.763694 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:29.763666 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-m8vtj" event={"ID":"85beff7e-b12a-48aa-a9a7-eb7bb35a110d","Type":"ContainerStarted","Data":"f78676f8cadca6d7176d7ec95327d7bff61faa0af7963c32142a195424177033"}
Apr 16 17:51:29.763806 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:29.763786 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-m8vtj"
Apr 16 17:51:29.788234 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:29.788173 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/servicemesh-operator3-55f49c5f94-m8vtj" podStartSLOduration=1.460873946 podStartE2EDuration="3.788155363s" podCreationTimestamp="2026-04-16 17:51:26 +0000 UTC" firstStartedPulling="2026-04-16 17:51:26.709143977 +0000 UTC m=+684.725611476" lastFinishedPulling="2026-04-16 17:51:29.036425391 +0000 UTC m=+687.052892893" observedRunningTime="2026-04-16 17:51:29.785464804 +0000 UTC m=+687.801932324" watchObservedRunningTime="2026-04-16 17:51:29.788155363 +0000 UTC m=+687.804622885"
Apr 16 17:51:40.769297 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:40.769260 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-m8vtj"
Apr 16 17:51:44.419042 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:44.419001 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-t4r8z"]
Apr 16 17:51:44.419512 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:44.419323 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c3d09783-174c-432a-a8a4-fc480c2d6bb3" containerName="extract"
Apr 16 17:51:44.419512 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:44.419335 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3d09783-174c-432a-a8a4-fc480c2d6bb3" containerName="extract"
Apr 16 17:51:44.419512 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:44.419348 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c3d09783-174c-432a-a8a4-fc480c2d6bb3" containerName="pull"
Apr 16 17:51:44.419512 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:44.419354 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3d09783-174c-432a-a8a4-fc480c2d6bb3" containerName="pull"
Apr 16 17:51:44.419512 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:44.419365 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c3d09783-174c-432a-a8a4-fc480c2d6bb3" containerName="util"
Apr 16 17:51:44.419512 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:44.419370 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3d09783-174c-432a-a8a4-fc480c2d6bb3" containerName="util"
Apr 16 17:51:44.419512 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:44.419420 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="c3d09783-174c-432a-a8a4-fc480c2d6bb3" containerName="extract"
Apr 16 17:51:44.423416 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:44.423398 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-t4r8z"
Apr 16 17:51:44.426338 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:44.426304 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istio-kubeconfig\""
Apr 16 17:51:44.426491 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:44.426414 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-tls\""
Apr 16 17:51:44.426491 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:44.426418 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\""
Apr 16 17:51:44.426491 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:44.426423 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"cacerts\""
Apr 16 17:51:44.426491 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:44.426473 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-openshift-gateway-dockercfg-qcjkg\""
Apr 16 17:51:44.435476 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:44.435452 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-t4r8z"]
Apr 16 17:51:44.515442 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:44.515401 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/00a0b91a-9bc8-4e6a-8270-bcf005283af3-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-7cd77c7ffd-t4r8z\" (UID: \"00a0b91a-9bc8-4e6a-8270-bcf005283af3\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-t4r8z"
Apr 16 17:51:44.515442 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:44.515447 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/00a0b91a-9bc8-4e6a-8270-bcf005283af3-local-certs\") pod \"istiod-openshift-gateway-7cd77c7ffd-t4r8z\" (UID: \"00a0b91a-9bc8-4e6a-8270-bcf005283af3\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-t4r8z"
Apr 16 17:51:44.515657 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:44.515465 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/00a0b91a-9bc8-4e6a-8270-bcf005283af3-istio-token\") pod \"istiod-openshift-gateway-7cd77c7ffd-t4r8z\" (UID: \"00a0b91a-9bc8-4e6a-8270-bcf005283af3\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-t4r8z"
Apr 16 17:51:44.515657 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:44.515481 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdlr8\" (UniqueName: \"kubernetes.io/projected/00a0b91a-9bc8-4e6a-8270-bcf005283af3-kube-api-access-zdlr8\") pod \"istiod-openshift-gateway-7cd77c7ffd-t4r8z\" (UID: \"00a0b91a-9bc8-4e6a-8270-bcf005283af3\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-t4r8z"
Apr 16 17:51:44.515657 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:44.515524 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/00a0b91a-9bc8-4e6a-8270-bcf005283af3-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-7cd77c7ffd-t4r8z\" (UID: \"00a0b91a-9bc8-4e6a-8270-bcf005283af3\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-t4r8z"
Apr 16 17:51:44.515657 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:44.515611 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/00a0b91a-9bc8-4e6a-8270-bcf005283af3-cacerts\") pod \"istiod-openshift-gateway-7cd77c7ffd-t4r8z\" (UID: \"00a0b91a-9bc8-4e6a-8270-bcf005283af3\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-t4r8z"
Apr 16 17:51:44.515657 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:44.515651 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/00a0b91a-9bc8-4e6a-8270-bcf005283af3-istio-kubeconfig\") pod \"istiod-openshift-gateway-7cd77c7ffd-t4r8z\" (UID: \"00a0b91a-9bc8-4e6a-8270-bcf005283af3\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-t4r8z"
Apr 16 17:51:44.616460 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:44.616425 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/00a0b91a-9bc8-4e6a-8270-bcf005283af3-cacerts\") pod \"istiod-openshift-gateway-7cd77c7ffd-t4r8z\" (UID: \"00a0b91a-9bc8-4e6a-8270-bcf005283af3\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-t4r8z"
Apr 16 17:51:44.616657 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:44.616491 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/00a0b91a-9bc8-4e6a-8270-bcf005283af3-istio-kubeconfig\") pod \"istiod-openshift-gateway-7cd77c7ffd-t4r8z\" (UID: \"00a0b91a-9bc8-4e6a-8270-bcf005283af3\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-t4r8z"
Apr 16 17:51:44.616657 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:44.616521 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/00a0b91a-9bc8-4e6a-8270-bcf005283af3-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-7cd77c7ffd-t4r8z\" (UID: \"00a0b91a-9bc8-4e6a-8270-bcf005283af3\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-t4r8z"
Apr 16 17:51:44.616657 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:44.616593 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/00a0b91a-9bc8-4e6a-8270-bcf005283af3-local-certs\") pod \"istiod-openshift-gateway-7cd77c7ffd-t4r8z\" (UID: \"00a0b91a-9bc8-4e6a-8270-bcf005283af3\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-t4r8z"
Apr 16 17:51:44.616833 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:44.616673 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/00a0b91a-9bc8-4e6a-8270-bcf005283af3-istio-token\") pod \"istiod-openshift-gateway-7cd77c7ffd-t4r8z\" (UID: \"00a0b91a-9bc8-4e6a-8270-bcf005283af3\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-t4r8z"
Apr 16 17:51:44.616833 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:44.616778 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zdlr8\" (UniqueName: \"kubernetes.io/projected/00a0b91a-9bc8-4e6a-8270-bcf005283af3-kube-api-access-zdlr8\") pod \"istiod-openshift-gateway-7cd77c7ffd-t4r8z\" (UID: \"00a0b91a-9bc8-4e6a-8270-bcf005283af3\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-t4r8z"
Apr 16 17:51:44.616951 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:44.616872 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/00a0b91a-9bc8-4e6a-8270-bcf005283af3-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-7cd77c7ffd-t4r8z\" (UID: \"00a0b91a-9bc8-4e6a-8270-bcf005283af3\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-t4r8z"
Apr 16 17:51:44.617383 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:44.617361 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/00a0b91a-9bc8-4e6a-8270-bcf005283af3-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-7cd77c7ffd-t4r8z\" (UID: \"00a0b91a-9bc8-4e6a-8270-bcf005283af3\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-t4r8z"
Apr 16 17:51:44.619253 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:44.619229 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/00a0b91a-9bc8-4e6a-8270-bcf005283af3-local-certs\") pod \"istiod-openshift-gateway-7cd77c7ffd-t4r8z\" (UID: \"00a0b91a-9bc8-4e6a-8270-bcf005283af3\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-t4r8z"
Apr 16 17:51:44.619462 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:44.619441 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/00a0b91a-9bc8-4e6a-8270-bcf005283af3-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-7cd77c7ffd-t4r8z\" (UID: \"00a0b91a-9bc8-4e6a-8270-bcf005283af3\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-t4r8z"
Apr 16 17:51:44.619534 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:44.619447 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/00a0b91a-9bc8-4e6a-8270-bcf005283af3-istio-kubeconfig\") pod \"istiod-openshift-gateway-7cd77c7ffd-t4r8z\" (UID: \"00a0b91a-9bc8-4e6a-8270-bcf005283af3\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-t4r8z"
Apr 16 17:51:44.619534 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:44.619498 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/00a0b91a-9bc8-4e6a-8270-bcf005283af3-cacerts\") pod \"istiod-openshift-gateway-7cd77c7ffd-t4r8z\" (UID: \"00a0b91a-9bc8-4e6a-8270-bcf005283af3\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-t4r8z"
Apr 16 17:51:44.627637 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:44.627589 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdlr8\" (UniqueName: \"kubernetes.io/projected/00a0b91a-9bc8-4e6a-8270-bcf005283af3-kube-api-access-zdlr8\") pod \"istiod-openshift-gateway-7cd77c7ffd-t4r8z\" (UID: \"00a0b91a-9bc8-4e6a-8270-bcf005283af3\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-t4r8z"
Apr 16 17:51:44.628035 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:44.628012 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/00a0b91a-9bc8-4e6a-8270-bcf005283af3-istio-token\") pod \"istiod-openshift-gateway-7cd77c7ffd-t4r8z\" (UID: \"00a0b91a-9bc8-4e6a-8270-bcf005283af3\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-t4r8z"
Apr 16 17:51:44.733368 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:44.733256 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-t4r8z"
Apr 16 17:51:44.879833 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:44.879799 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-t4r8z"]
Apr 16 17:51:44.883445 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:51:44.883417 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00a0b91a_9bc8_4e6a_8270_bcf005283af3.slice/crio-9d96e5b30e1ad1ee5626c63cfb66c8f524abc48a079f82e27f29ac9e3c6a7cdd WatchSource:0}: Error finding container 9d96e5b30e1ad1ee5626c63cfb66c8f524abc48a079f82e27f29ac9e3c6a7cdd: Status 404 returned error can't find the container with id 9d96e5b30e1ad1ee5626c63cfb66c8f524abc48a079f82e27f29ac9e3c6a7cdd
Apr 16 17:51:45.825051 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:45.825009 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-t4r8z" event={"ID":"00a0b91a-9bc8-4e6a-8270-bcf005283af3","Type":"ContainerStarted","Data":"9d96e5b30e1ad1ee5626c63cfb66c8f524abc48a079f82e27f29ac9e3c6a7cdd"}
Apr 16 17:51:47.437517 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:47.437479 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"}
Apr 16 17:51:47.437828 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:47.437545 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"}
Apr 16 17:51:47.834215 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:47.834123 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-t4r8z" event={"ID":"00a0b91a-9bc8-4e6a-8270-bcf005283af3","Type":"ContainerStarted","Data":"f6846be95aed1ec332b96ee3ecafae042c37052065e7dc686827452bce646406"}
Apr 16 17:51:47.834369 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:47.834252 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-t4r8z"
Apr 16 17:51:47.862284 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:47.862223 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-t4r8z" podStartSLOduration=1.3107103 podStartE2EDuration="3.862206417s" podCreationTimestamp="2026-04-16 17:51:44 +0000 UTC" firstStartedPulling="2026-04-16 17:51:44.885753612 +0000 UTC m=+702.902221111" lastFinishedPulling="2026-04-16 17:51:47.437249711 +0000 UTC m=+705.453717228" observedRunningTime="2026-04-16 17:51:47.860138202 +0000 UTC m=+705.876605725" watchObservedRunningTime="2026-04-16 17:51:47.862206417 +0000 UTC m=+705.878673951"
Apr 16 17:51:48.839925 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:48.839885 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-t4r8z"
Apr 16 17:51:51.464162 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:51.464131 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4ltxc"]
Apr 16 17:51:51.467556 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:51.467531 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4ltxc"
Apr 16 17:51:51.470973 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:51.470949 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"openshift-ai-inference-openshift-default-dockercfg-pjlnx\""
Apr 16 17:51:51.485279 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:51.485246 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4ltxc"]
Apr 16 17:51:51.578879 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:51.578847 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/bebca98f-660f-43f3-90d4-f1020f766e4f-istio-token\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-4ltxc\" (UID: \"bebca98f-660f-43f3-90d4-f1020f766e4f\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4ltxc"
Apr 16 17:51:51.578879 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:51.578881 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/bebca98f-660f-43f3-90d4-f1020f766e4f-workload-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-4ltxc\" (UID: \"bebca98f-660f-43f3-90d4-f1020f766e4f\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4ltxc"
Apr 16 17:51:51.579111 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:51.578919 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/bebca98f-660f-43f3-90d4-f1020f766e4f-istio-data\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-4ltxc\" (UID: \"bebca98f-660f-43f3-90d4-f1020f766e4f\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4ltxc"
Apr 16 17:51:51.579111 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:51.578951 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/bebca98f-660f-43f3-90d4-f1020f766e4f-istio-podinfo\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-4ltxc\" (UID: \"bebca98f-660f-43f3-90d4-f1020f766e4f\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4ltxc"
Apr 16 17:51:51.579111 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:51.579021 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/bebca98f-660f-43f3-90d4-f1020f766e4f-workload-certs\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-4ltxc\" (UID: \"bebca98f-660f-43f3-90d4-f1020f766e4f\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4ltxc"
Apr 16 17:51:51.579111 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:51.579050 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/bebca98f-660f-43f3-90d4-f1020f766e4f-credential-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-4ltxc\" (UID: \"bebca98f-660f-43f3-90d4-f1020f766e4f\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4ltxc"
Apr 16 17:51:51.579111 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:51.579090 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wmxc\" (UniqueName: \"kubernetes.io/projected/bebca98f-660f-43f3-90d4-f1020f766e4f-kube-api-access-6wmxc\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-4ltxc\" (UID: \"bebca98f-660f-43f3-90d4-f1020f766e4f\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4ltxc"
Apr 16 17:51:51.579271 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:51.579121 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/bebca98f-660f-43f3-90d4-f1020f766e4f-istiod-ca-cert\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-4ltxc\" (UID: \"bebca98f-660f-43f3-90d4-f1020f766e4f\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4ltxc"
Apr 16 17:51:51.579271 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:51.579140 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/bebca98f-660f-43f3-90d4-f1020f766e4f-istio-envoy\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-4ltxc\" (UID: \"bebca98f-660f-43f3-90d4-f1020f766e4f\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4ltxc"
Apr 16 17:51:51.680499 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:51.680458 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/bebca98f-660f-43f3-90d4-f1020f766e4f-istiod-ca-cert\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-4ltxc\" (UID: \"bebca98f-660f-43f3-90d4-f1020f766e4f\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4ltxc"
Apr 16 17:51:51.680499 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:51.680501 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/bebca98f-660f-43f3-90d4-f1020f766e4f-istio-envoy\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-4ltxc\" (UID: \"bebca98f-660f-43f3-90d4-f1020f766e4f\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4ltxc"
Apr 16 17:51:51.680754 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:51.680550 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/bebca98f-660f-43f3-90d4-f1020f766e4f-istio-token\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-4ltxc\" (UID: \"bebca98f-660f-43f3-90d4-f1020f766e4f\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4ltxc"
Apr 16 17:51:51.680754 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:51.680578 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/bebca98f-660f-43f3-90d4-f1020f766e4f-workload-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-4ltxc\" (UID: \"bebca98f-660f-43f3-90d4-f1020f766e4f\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4ltxc"
Apr 16 17:51:51.680754 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:51.680627 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/bebca98f-660f-43f3-90d4-f1020f766e4f-istio-data\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-4ltxc\" (UID: \"bebca98f-660f-43f3-90d4-f1020f766e4f\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4ltxc"
Apr 16 17:51:51.680754 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:51.680665 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/bebca98f-660f-43f3-90d4-f1020f766e4f-istio-podinfo\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-4ltxc\" (UID: \"bebca98f-660f-43f3-90d4-f1020f766e4f\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4ltxc"
Apr 16 17:51:51.680754 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:51.680692 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/bebca98f-660f-43f3-90d4-f1020f766e4f-workload-certs\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-4ltxc\" (UID: \"bebca98f-660f-43f3-90d4-f1020f766e4f\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4ltxc"
Apr 16 17:51:51.680754 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:51.680715 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/bebca98f-660f-43f3-90d4-f1020f766e4f-credential-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-4ltxc\" (UID: \"bebca98f-660f-43f3-90d4-f1020f766e4f\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4ltxc"
Apr 16 17:51:51.681075 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:51.680766 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6wmxc\" (UniqueName: \"kubernetes.io/projected/bebca98f-660f-43f3-90d4-f1020f766e4f-kube-api-access-6wmxc\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-4ltxc\" (UID: \"bebca98f-660f-43f3-90d4-f1020f766e4f\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4ltxc"
Apr 16 17:51:51.681075 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:51.681027 2576 operation_generator.go:615]
"MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/bebca98f-660f-43f3-90d4-f1020f766e4f-workload-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-4ltxc\" (UID: \"bebca98f-660f-43f3-90d4-f1020f766e4f\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4ltxc" Apr 16 17:51:51.681259 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:51.681234 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/bebca98f-660f-43f3-90d4-f1020f766e4f-workload-certs\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-4ltxc\" (UID: \"bebca98f-660f-43f3-90d4-f1020f766e4f\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4ltxc" Apr 16 17:51:51.681326 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:51.681300 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/bebca98f-660f-43f3-90d4-f1020f766e4f-istiod-ca-cert\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-4ltxc\" (UID: \"bebca98f-660f-43f3-90d4-f1020f766e4f\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4ltxc" Apr 16 17:51:51.681373 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:51.681293 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/bebca98f-660f-43f3-90d4-f1020f766e4f-credential-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-4ltxc\" (UID: \"bebca98f-660f-43f3-90d4-f1020f766e4f\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4ltxc" Apr 16 17:51:51.681373 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:51.681325 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: 
\"kubernetes.io/empty-dir/bebca98f-660f-43f3-90d4-f1020f766e4f-istio-data\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-4ltxc\" (UID: \"bebca98f-660f-43f3-90d4-f1020f766e4f\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4ltxc" Apr 16 17:51:51.683305 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:51.683278 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/bebca98f-660f-43f3-90d4-f1020f766e4f-istio-envoy\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-4ltxc\" (UID: \"bebca98f-660f-43f3-90d4-f1020f766e4f\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4ltxc" Apr 16 17:51:51.683443 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:51.683323 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/bebca98f-660f-43f3-90d4-f1020f766e4f-istio-podinfo\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-4ltxc\" (UID: \"bebca98f-660f-43f3-90d4-f1020f766e4f\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4ltxc" Apr 16 17:51:51.693823 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:51.693796 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/bebca98f-660f-43f3-90d4-f1020f766e4f-istio-token\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-4ltxc\" (UID: \"bebca98f-660f-43f3-90d4-f1020f766e4f\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4ltxc" Apr 16 17:51:51.693951 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:51.693826 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wmxc\" (UniqueName: \"kubernetes.io/projected/bebca98f-660f-43f3-90d4-f1020f766e4f-kube-api-access-6wmxc\") pod 
\"openshift-ai-inference-openshift-default-7c5447bb76-4ltxc\" (UID: \"bebca98f-660f-43f3-90d4-f1020f766e4f\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4ltxc" Apr 16 17:51:51.779686 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:51.779591 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4ltxc" Apr 16 17:51:51.946747 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:51.946717 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4ltxc"] Apr 16 17:51:51.949784 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:51:51.949753 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbebca98f_660f_43f3_90d4_f1020f766e4f.slice/crio-431b8e9493f678ab436eef62c802dab6191c51803e95454d824e15240d329a2c WatchSource:0}: Error finding container 431b8e9493f678ab436eef62c802dab6191c51803e95454d824e15240d329a2c: Status 404 returned error can't find the container with id 431b8e9493f678ab436eef62c802dab6191c51803e95454d824e15240d329a2c Apr 16 17:51:52.857006 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:52.856968 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4ltxc" event={"ID":"bebca98f-660f-43f3-90d4-f1020f766e4f","Type":"ContainerStarted","Data":"431b8e9493f678ab436eef62c802dab6191c51803e95454d824e15240d329a2c"} Apr 16 17:51:54.671689 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:54.671654 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 16 17:51:54.672074 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:54.671734 2576 kubelet_resources.go:45] "Allocatable" 
allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 16 17:51:54.672074 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:54.671779 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 16 17:51:54.865572 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:54.865529 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4ltxc" event={"ID":"bebca98f-660f-43f3-90d4-f1020f766e4f","Type":"ContainerStarted","Data":"dd89be590b11e9860f65f239d59819aec98f3965567df20f0835b896d9946378"} Apr 16 17:51:54.895631 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:54.895574 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4ltxc" podStartSLOduration=1.175821155 podStartE2EDuration="3.895556604s" podCreationTimestamp="2026-04-16 17:51:51 +0000 UTC" firstStartedPulling="2026-04-16 17:51:51.951688711 +0000 UTC m=+709.968156210" lastFinishedPulling="2026-04-16 17:51:54.671424161 +0000 UTC m=+712.687891659" observedRunningTime="2026-04-16 17:51:54.893462727 +0000 UTC m=+712.909930248" watchObservedRunningTime="2026-04-16 17:51:54.895556604 +0000 UTC m=+712.912024124" Apr 16 17:51:55.780735 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:55.780695 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4ltxc" Apr 16 17:51:55.785570 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:55.785545 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4ltxc" Apr 16 17:51:55.869297 ip-10-0-141-32 kubenswrapper[2576]: I0416 
17:51:55.869266 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4ltxc" Apr 16 17:51:55.870405 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:51:55.870384 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-4ltxc" Apr 16 17:52:00.279708 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:00.279674 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767blzk5t"] Apr 16 17:52:00.284403 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:00.284381 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767blzk5t" Apr 16 17:52:00.287328 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:00.287307 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 17:52:00.287433 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:00.287387 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 17:52:00.288425 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:00.288411 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-l58bq\"" Apr 16 17:52:00.294555 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:00.294533 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767blzk5t"] Apr 16 17:52:00.354477 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:00.354439 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tkf8\" (UniqueName: 
\"kubernetes.io/projected/dafa07e0-963e-4799-884d-1fa068eed328-kube-api-access-2tkf8\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767blzk5t\" (UID: \"dafa07e0-963e-4799-884d-1fa068eed328\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767blzk5t" Apr 16 17:52:00.354656 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:00.354504 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dafa07e0-963e-4799-884d-1fa068eed328-bundle\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767blzk5t\" (UID: \"dafa07e0-963e-4799-884d-1fa068eed328\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767blzk5t" Apr 16 17:52:00.354656 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:00.354608 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dafa07e0-963e-4799-884d-1fa068eed328-util\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767blzk5t\" (UID: \"dafa07e0-963e-4799-884d-1fa068eed328\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767blzk5t" Apr 16 17:52:00.379189 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:00.379151 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30nnlh8"] Apr 16 17:52:00.382673 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:00.382657 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30nnlh8" Apr 16 17:52:00.394871 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:00.394843 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30nnlh8"] Apr 16 17:52:00.455918 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:00.455881 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dafa07e0-963e-4799-884d-1fa068eed328-bundle\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767blzk5t\" (UID: \"dafa07e0-963e-4799-884d-1fa068eed328\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767blzk5t" Apr 16 17:52:00.456077 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:00.455966 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4888d6d1-3b5c-4e7c-a1a1-7c7e173c98f5-util\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30nnlh8\" (UID: \"4888d6d1-3b5c-4e7c-a1a1-7c7e173c98f5\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30nnlh8" Apr 16 17:52:00.456077 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:00.456032 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dafa07e0-963e-4799-884d-1fa068eed328-util\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767blzk5t\" (UID: \"dafa07e0-963e-4799-884d-1fa068eed328\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767blzk5t" Apr 16 17:52:00.456203 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:00.456122 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/4888d6d1-3b5c-4e7c-a1a1-7c7e173c98f5-bundle\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30nnlh8\" (UID: \"4888d6d1-3b5c-4e7c-a1a1-7c7e173c98f5\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30nnlh8" Apr 16 17:52:00.456203 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:00.456192 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2tkf8\" (UniqueName: \"kubernetes.io/projected/dafa07e0-963e-4799-884d-1fa068eed328-kube-api-access-2tkf8\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767blzk5t\" (UID: \"dafa07e0-963e-4799-884d-1fa068eed328\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767blzk5t" Apr 16 17:52:00.456276 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:00.456233 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zjnq\" (UniqueName: \"kubernetes.io/projected/4888d6d1-3b5c-4e7c-a1a1-7c7e173c98f5-kube-api-access-2zjnq\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30nnlh8\" (UID: \"4888d6d1-3b5c-4e7c-a1a1-7c7e173c98f5\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30nnlh8" Apr 16 17:52:00.456320 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:00.456297 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dafa07e0-963e-4799-884d-1fa068eed328-bundle\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767blzk5t\" (UID: \"dafa07e0-963e-4799-884d-1fa068eed328\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767blzk5t" Apr 16 17:52:00.456369 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:00.456354 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/dafa07e0-963e-4799-884d-1fa068eed328-util\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767blzk5t\" (UID: \"dafa07e0-963e-4799-884d-1fa068eed328\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767blzk5t" Apr 16 17:52:00.472706 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:00.472673 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tkf8\" (UniqueName: \"kubernetes.io/projected/dafa07e0-963e-4799-884d-1fa068eed328-kube-api-access-2tkf8\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767blzk5t\" (UID: \"dafa07e0-963e-4799-884d-1fa068eed328\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767blzk5t" Apr 16 17:52:00.497930 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:00.497871 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88txflf"] Apr 16 17:52:00.501537 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:00.501521 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88txflf" Apr 16 17:52:00.512673 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:00.512648 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88txflf"] Apr 16 17:52:00.556931 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:00.556814 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4888d6d1-3b5c-4e7c-a1a1-7c7e173c98f5-bundle\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30nnlh8\" (UID: \"4888d6d1-3b5c-4e7c-a1a1-7c7e173c98f5\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30nnlh8" Apr 16 17:52:00.556931 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:00.556877 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2zjnq\" (UniqueName: \"kubernetes.io/projected/4888d6d1-3b5c-4e7c-a1a1-7c7e173c98f5-kube-api-access-2zjnq\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30nnlh8\" (UID: \"4888d6d1-3b5c-4e7c-a1a1-7c7e173c98f5\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30nnlh8" Apr 16 17:52:00.557168 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:00.556950 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/054c19e2-d25e-4ec9-b522-205e31675425-bundle\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88txflf\" (UID: \"054c19e2-d25e-4ec9-b522-205e31675425\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88txflf" Apr 16 17:52:00.557168 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:00.556985 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/4888d6d1-3b5c-4e7c-a1a1-7c7e173c98f5-util\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30nnlh8\" (UID: \"4888d6d1-3b5c-4e7c-a1a1-7c7e173c98f5\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30nnlh8" Apr 16 17:52:00.557168 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:00.557013 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2p8tx\" (UniqueName: \"kubernetes.io/projected/054c19e2-d25e-4ec9-b522-205e31675425-kube-api-access-2p8tx\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88txflf\" (UID: \"054c19e2-d25e-4ec9-b522-205e31675425\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88txflf" Apr 16 17:52:00.557168 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:00.557053 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/054c19e2-d25e-4ec9-b522-205e31675425-util\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88txflf\" (UID: \"054c19e2-d25e-4ec9-b522-205e31675425\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88txflf" Apr 16 17:52:00.557380 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:00.557341 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4888d6d1-3b5c-4e7c-a1a1-7c7e173c98f5-bundle\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30nnlh8\" (UID: \"4888d6d1-3b5c-4e7c-a1a1-7c7e173c98f5\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30nnlh8" Apr 16 17:52:00.557441 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:00.557350 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/4888d6d1-3b5c-4e7c-a1a1-7c7e173c98f5-util\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30nnlh8\" (UID: \"4888d6d1-3b5c-4e7c-a1a1-7c7e173c98f5\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30nnlh8" Apr 16 17:52:00.570686 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:00.570649 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zjnq\" (UniqueName: \"kubernetes.io/projected/4888d6d1-3b5c-4e7c-a1a1-7c7e173c98f5-kube-api-access-2zjnq\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30nnlh8\" (UID: \"4888d6d1-3b5c-4e7c-a1a1-7c7e173c98f5\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30nnlh8" Apr 16 17:52:00.576515 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:00.576483 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503sghvc"] Apr 16 17:52:00.580278 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:00.580258 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503sghvc" Apr 16 17:52:00.590663 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:00.590641 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503sghvc"] Apr 16 17:52:00.595881 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:00.595856 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767blzk5t" Apr 16 17:52:00.658014 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:00.657970 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnrss\" (UniqueName: \"kubernetes.io/projected/cb996290-8a2f-4cae-b546-78bcf0c74441-kube-api-access-hnrss\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503sghvc\" (UID: \"cb996290-8a2f-4cae-b546-78bcf0c74441\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503sghvc" Apr 16 17:52:00.658169 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:00.658035 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/054c19e2-d25e-4ec9-b522-205e31675425-bundle\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88txflf\" (UID: \"054c19e2-d25e-4ec9-b522-205e31675425\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88txflf" Apr 16 17:52:00.658169 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:00.658078 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2p8tx\" (UniqueName: \"kubernetes.io/projected/054c19e2-d25e-4ec9-b522-205e31675425-kube-api-access-2p8tx\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88txflf\" (UID: \"054c19e2-d25e-4ec9-b522-205e31675425\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88txflf" Apr 16 17:52:00.658169 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:00.658118 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/054c19e2-d25e-4ec9-b522-205e31675425-util\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88txflf\" (UID: \"054c19e2-d25e-4ec9-b522-205e31675425\") " 
pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88txflf" Apr 16 17:52:00.658169 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:00.658147 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cb996290-8a2f-4cae-b546-78bcf0c74441-util\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503sghvc\" (UID: \"cb996290-8a2f-4cae-b546-78bcf0c74441\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503sghvc" Apr 16 17:52:00.658393 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:00.658177 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cb996290-8a2f-4cae-b546-78bcf0c74441-bundle\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503sghvc\" (UID: \"cb996290-8a2f-4cae-b546-78bcf0c74441\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503sghvc" Apr 16 17:52:00.658550 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:00.658521 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/054c19e2-d25e-4ec9-b522-205e31675425-bundle\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88txflf\" (UID: \"054c19e2-d25e-4ec9-b522-205e31675425\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88txflf" Apr 16 17:52:00.658689 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:00.658570 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/054c19e2-d25e-4ec9-b522-205e31675425-util\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88txflf\" (UID: \"054c19e2-d25e-4ec9-b522-205e31675425\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88txflf" 
Apr 16 17:52:00.668727 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:00.668609 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2p8tx\" (UniqueName: \"kubernetes.io/projected/054c19e2-d25e-4ec9-b522-205e31675425-kube-api-access-2p8tx\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88txflf\" (UID: \"054c19e2-d25e-4ec9-b522-205e31675425\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88txflf"
Apr 16 17:52:00.691226 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:00.691195 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30nnlh8"
Apr 16 17:52:00.733842 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:00.733778 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767blzk5t"]
Apr 16 17:52:00.735953 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:52:00.735920 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddafa07e0_963e_4799_884d_1fa068eed328.slice/crio-d868aaeb3659caa1bd5779ef181be3a17d1e8d37c220e5c861b0835e218e0102 WatchSource:0}: Error finding container d868aaeb3659caa1bd5779ef181be3a17d1e8d37c220e5c861b0835e218e0102: Status 404 returned error can't find the container with id d868aaeb3659caa1bd5779ef181be3a17d1e8d37c220e5c861b0835e218e0102
Apr 16 17:52:00.759550 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:00.759498 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cb996290-8a2f-4cae-b546-78bcf0c74441-util\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503sghvc\" (UID: \"cb996290-8a2f-4cae-b546-78bcf0c74441\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503sghvc"
Apr 16 17:52:00.759683 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:00.759549 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cb996290-8a2f-4cae-b546-78bcf0c74441-bundle\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503sghvc\" (UID: \"cb996290-8a2f-4cae-b546-78bcf0c74441\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503sghvc"
Apr 16 17:52:00.759820 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:00.759783 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hnrss\" (UniqueName: \"kubernetes.io/projected/cb996290-8a2f-4cae-b546-78bcf0c74441-kube-api-access-hnrss\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503sghvc\" (UID: \"cb996290-8a2f-4cae-b546-78bcf0c74441\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503sghvc"
Apr 16 17:52:00.760021 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:00.760002 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cb996290-8a2f-4cae-b546-78bcf0c74441-util\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503sghvc\" (UID: \"cb996290-8a2f-4cae-b546-78bcf0c74441\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503sghvc"
Apr 16 17:52:00.760124 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:00.760033 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cb996290-8a2f-4cae-b546-78bcf0c74441-bundle\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503sghvc\" (UID: \"cb996290-8a2f-4cae-b546-78bcf0c74441\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503sghvc"
Apr 16 17:52:00.772660 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:00.772634 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnrss\" (UniqueName: \"kubernetes.io/projected/cb996290-8a2f-4cae-b546-78bcf0c74441-kube-api-access-hnrss\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503sghvc\" (UID: \"cb996290-8a2f-4cae-b546-78bcf0c74441\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503sghvc"
Apr 16 17:52:00.812051 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:00.812027 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88txflf"
Apr 16 17:52:00.830587 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:00.830560 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30nnlh8"]
Apr 16 17:52:00.833969 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:52:00.833929 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4888d6d1_3b5c_4e7c_a1a1_7c7e173c98f5.slice/crio-244ff504cff2f438d12c9009f187b5e0ca6d4aa88067e8918d8f0102b1f514c2 WatchSource:0}: Error finding container 244ff504cff2f438d12c9009f187b5e0ca6d4aa88067e8918d8f0102b1f514c2: Status 404 returned error can't find the container with id 244ff504cff2f438d12c9009f187b5e0ca6d4aa88067e8918d8f0102b1f514c2
Apr 16 17:52:00.890656 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:00.890631 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503sghvc"
Apr 16 17:52:00.891962 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:00.891937 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30nnlh8" event={"ID":"4888d6d1-3b5c-4e7c-a1a1-7c7e173c98f5","Type":"ContainerStarted","Data":"244ff504cff2f438d12c9009f187b5e0ca6d4aa88067e8918d8f0102b1f514c2"}
Apr 16 17:52:00.893302 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:00.893276 2576 generic.go:358] "Generic (PLEG): container finished" podID="dafa07e0-963e-4799-884d-1fa068eed328" containerID="d2bb7ab02c676f20b72364defb4da88e3a8b905c585a251619e3054693cf9c0e" exitCode=0
Apr 16 17:52:00.893430 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:00.893357 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767blzk5t" event={"ID":"dafa07e0-963e-4799-884d-1fa068eed328","Type":"ContainerDied","Data":"d2bb7ab02c676f20b72364defb4da88e3a8b905c585a251619e3054693cf9c0e"}
Apr 16 17:52:00.893430 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:00.893384 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767blzk5t" event={"ID":"dafa07e0-963e-4799-884d-1fa068eed328","Type":"ContainerStarted","Data":"d868aaeb3659caa1bd5779ef181be3a17d1e8d37c220e5c861b0835e218e0102"}
Apr 16 17:52:00.956413 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:00.956009 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88txflf"]
Apr 16 17:52:00.959583 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:52:00.959552 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod054c19e2_d25e_4ec9_b522_205e31675425.slice/crio-781b68a4dde2a020708c977a8834228ee3b40daa4a066143cb135e3ac1c6fec2 WatchSource:0}: Error finding container 781b68a4dde2a020708c977a8834228ee3b40daa4a066143cb135e3ac1c6fec2: Status 404 returned error can't find the container with id 781b68a4dde2a020708c977a8834228ee3b40daa4a066143cb135e3ac1c6fec2
Apr 16 17:52:01.042602 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:01.042575 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503sghvc"]
Apr 16 17:52:01.050214 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:52:01.050187 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb996290_8a2f_4cae_b546_78bcf0c74441.slice/crio-0722c398ae8ad236b2c61672804b3638d3a90f15d21136611f3e1bda9f70d216 WatchSource:0}: Error finding container 0722c398ae8ad236b2c61672804b3638d3a90f15d21136611f3e1bda9f70d216: Status 404 returned error can't find the container with id 0722c398ae8ad236b2c61672804b3638d3a90f15d21136611f3e1bda9f70d216
Apr 16 17:52:01.899119 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:01.899081 2576 generic.go:358] "Generic (PLEG): container finished" podID="cb996290-8a2f-4cae-b546-78bcf0c74441" containerID="96c975157b5fa2528454df437e72e8be83a35b00c6eb9532a2f5d5441dcdc6dd" exitCode=0
Apr 16 17:52:01.899472 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:01.899162 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503sghvc" event={"ID":"cb996290-8a2f-4cae-b546-78bcf0c74441","Type":"ContainerDied","Data":"96c975157b5fa2528454df437e72e8be83a35b00c6eb9532a2f5d5441dcdc6dd"}
Apr 16 17:52:01.899472 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:01.899200 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503sghvc" event={"ID":"cb996290-8a2f-4cae-b546-78bcf0c74441","Type":"ContainerStarted","Data":"0722c398ae8ad236b2c61672804b3638d3a90f15d21136611f3e1bda9f70d216"}
Apr 16 17:52:01.900864 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:01.900838 2576 generic.go:358] "Generic (PLEG): container finished" podID="4888d6d1-3b5c-4e7c-a1a1-7c7e173c98f5" containerID="e026a653935c5dee475e949205af0dcf59e21a6fb4ee44f85d645a7959c556a4" exitCode=0
Apr 16 17:52:01.900974 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:01.900956 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30nnlh8" event={"ID":"4888d6d1-3b5c-4e7c-a1a1-7c7e173c98f5","Type":"ContainerDied","Data":"e026a653935c5dee475e949205af0dcf59e21a6fb4ee44f85d645a7959c556a4"}
Apr 16 17:52:01.903206 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:01.903184 2576 generic.go:358] "Generic (PLEG): container finished" podID="054c19e2-d25e-4ec9-b522-205e31675425" containerID="24bdf955fb1cf1217d7807548cd8e53ce96b0eeb418b7197cad6e66f19e783fd" exitCode=0
Apr 16 17:52:01.903343 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:01.903275 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88txflf" event={"ID":"054c19e2-d25e-4ec9-b522-205e31675425","Type":"ContainerDied","Data":"24bdf955fb1cf1217d7807548cd8e53ce96b0eeb418b7197cad6e66f19e783fd"}
Apr 16 17:52:01.903343 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:01.903310 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88txflf" event={"ID":"054c19e2-d25e-4ec9-b522-205e31675425","Type":"ContainerStarted","Data":"781b68a4dde2a020708c977a8834228ee3b40daa4a066143cb135e3ac1c6fec2"}
Apr 16 17:52:02.908372 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:02.908338 2576 generic.go:358] "Generic (PLEG): container finished" podID="dafa07e0-963e-4799-884d-1fa068eed328" containerID="868bace8052f696a13ca75e0dfe103988cff42f6c24709ba8ed839de55929a6f" exitCode=0
Apr 16 17:52:02.908675 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:02.908419 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767blzk5t" event={"ID":"dafa07e0-963e-4799-884d-1fa068eed328","Type":"ContainerDied","Data":"868bace8052f696a13ca75e0dfe103988cff42f6c24709ba8ed839de55929a6f"}
Apr 16 17:52:03.914227 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:03.914192 2576 generic.go:358] "Generic (PLEG): container finished" podID="cb996290-8a2f-4cae-b546-78bcf0c74441" containerID="7a433196d1d24a53e7e7747f66f7d726f491eabe47645f04aea94807c38a86ca" exitCode=0
Apr 16 17:52:03.914678 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:03.914274 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503sghvc" event={"ID":"cb996290-8a2f-4cae-b546-78bcf0c74441","Type":"ContainerDied","Data":"7a433196d1d24a53e7e7747f66f7d726f491eabe47645f04aea94807c38a86ca"}
Apr 16 17:52:03.915982 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:03.915843 2576 generic.go:358] "Generic (PLEG): container finished" podID="4888d6d1-3b5c-4e7c-a1a1-7c7e173c98f5" containerID="3df526dbc6962f3e9f22e3cb9f1ee792c5d89c43e64add0be95040c16162b98f" exitCode=0
Apr 16 17:52:03.915982 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:03.915903 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30nnlh8" event={"ID":"4888d6d1-3b5c-4e7c-a1a1-7c7e173c98f5","Type":"ContainerDied","Data":"3df526dbc6962f3e9f22e3cb9f1ee792c5d89c43e64add0be95040c16162b98f"}
Apr 16 17:52:03.917667 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:03.917648 2576 generic.go:358] "Generic (PLEG): container finished" podID="dafa07e0-963e-4799-884d-1fa068eed328" containerID="846a257dcd52753285001eadd95a5849278b7a3fcf2a190a72d814f598fc82e9" exitCode=0
Apr 16 17:52:03.917755 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:03.917732 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767blzk5t" event={"ID":"dafa07e0-963e-4799-884d-1fa068eed328","Type":"ContainerDied","Data":"846a257dcd52753285001eadd95a5849278b7a3fcf2a190a72d814f598fc82e9"}
Apr 16 17:52:03.919242 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:03.919214 2576 generic.go:358] "Generic (PLEG): container finished" podID="054c19e2-d25e-4ec9-b522-205e31675425" containerID="6e5e75b23514a194baac789086485a946c92b5013de679a41250d90676a9da3a" exitCode=0
Apr 16 17:52:03.919338 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:03.919287 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88txflf" event={"ID":"054c19e2-d25e-4ec9-b522-205e31675425","Type":"ContainerDied","Data":"6e5e75b23514a194baac789086485a946c92b5013de679a41250d90676a9da3a"}
Apr 16 17:52:04.924693 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:04.924659 2576 generic.go:358] "Generic (PLEG): container finished" podID="4888d6d1-3b5c-4e7c-a1a1-7c7e173c98f5" containerID="ac05270fc3c3080f0aaf4e6b2f85feb1e217a4c00ea66e04cb025d74dfbd0b98" exitCode=0
Apr 16 17:52:04.925143 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:04.924732 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30nnlh8" event={"ID":"4888d6d1-3b5c-4e7c-a1a1-7c7e173c98f5","Type":"ContainerDied","Data":"ac05270fc3c3080f0aaf4e6b2f85feb1e217a4c00ea66e04cb025d74dfbd0b98"}
Apr 16 17:52:04.926524 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:04.926503 2576 generic.go:358] "Generic (PLEG): container finished" podID="054c19e2-d25e-4ec9-b522-205e31675425" containerID="4fc0f67c9c505924b92cde062a11b15698be765b9dbf9f8cdb7af1bfb9fbba30" exitCode=0
Apr 16 17:52:04.926649 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:04.926618 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88txflf" event={"ID":"054c19e2-d25e-4ec9-b522-205e31675425","Type":"ContainerDied","Data":"4fc0f67c9c505924b92cde062a11b15698be765b9dbf9f8cdb7af1bfb9fbba30"}
Apr 16 17:52:04.928222 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:04.928204 2576 generic.go:358] "Generic (PLEG): container finished" podID="cb996290-8a2f-4cae-b546-78bcf0c74441" containerID="fe79eeab17271d8f9bc9eb9a305bfba4606644b14047f8b82fb6d8d51b99b02b" exitCode=0
Apr 16 17:52:04.928305 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:04.928264 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503sghvc" event={"ID":"cb996290-8a2f-4cae-b546-78bcf0c74441","Type":"ContainerDied","Data":"fe79eeab17271d8f9bc9eb9a305bfba4606644b14047f8b82fb6d8d51b99b02b"}
Apr 16 17:52:05.056294 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:05.056271 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767blzk5t"
Apr 16 17:52:05.201503 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:05.201469 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2tkf8\" (UniqueName: \"kubernetes.io/projected/dafa07e0-963e-4799-884d-1fa068eed328-kube-api-access-2tkf8\") pod \"dafa07e0-963e-4799-884d-1fa068eed328\" (UID: \"dafa07e0-963e-4799-884d-1fa068eed328\") "
Apr 16 17:52:05.201680 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:05.201571 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dafa07e0-963e-4799-884d-1fa068eed328-util\") pod \"dafa07e0-963e-4799-884d-1fa068eed328\" (UID: \"dafa07e0-963e-4799-884d-1fa068eed328\") "
Apr 16 17:52:05.201680 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:05.201605 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dafa07e0-963e-4799-884d-1fa068eed328-bundle\") pod \"dafa07e0-963e-4799-884d-1fa068eed328\" (UID: \"dafa07e0-963e-4799-884d-1fa068eed328\") "
Apr 16 17:52:05.202124 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:05.202099 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dafa07e0-963e-4799-884d-1fa068eed328-bundle" (OuterVolumeSpecName: "bundle") pod "dafa07e0-963e-4799-884d-1fa068eed328" (UID: "dafa07e0-963e-4799-884d-1fa068eed328"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 17:52:05.203439 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:05.203415 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dafa07e0-963e-4799-884d-1fa068eed328-kube-api-access-2tkf8" (OuterVolumeSpecName: "kube-api-access-2tkf8") pod "dafa07e0-963e-4799-884d-1fa068eed328" (UID: "dafa07e0-963e-4799-884d-1fa068eed328"). InnerVolumeSpecName "kube-api-access-2tkf8". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 17:52:05.208704 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:05.208662 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dafa07e0-963e-4799-884d-1fa068eed328-util" (OuterVolumeSpecName: "util") pod "dafa07e0-963e-4799-884d-1fa068eed328" (UID: "dafa07e0-963e-4799-884d-1fa068eed328"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 17:52:05.302991 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:05.302941 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2tkf8\" (UniqueName: \"kubernetes.io/projected/dafa07e0-963e-4799-884d-1fa068eed328-kube-api-access-2tkf8\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\""
Apr 16 17:52:05.302991 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:05.302986 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dafa07e0-963e-4799-884d-1fa068eed328-util\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\""
Apr 16 17:52:05.302991 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:05.302996 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dafa07e0-963e-4799-884d-1fa068eed328-bundle\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\""
Apr 16 17:52:05.934136 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:05.934101 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767blzk5t"
Apr 16 17:52:05.934136 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:05.934112 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767blzk5t" event={"ID":"dafa07e0-963e-4799-884d-1fa068eed328","Type":"ContainerDied","Data":"d868aaeb3659caa1bd5779ef181be3a17d1e8d37c220e5c861b0835e218e0102"}
Apr 16 17:52:05.934620 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:05.934150 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d868aaeb3659caa1bd5779ef181be3a17d1e8d37c220e5c861b0835e218e0102"
Apr 16 17:52:06.071123 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:06.071097 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88txflf"
Apr 16 17:52:06.121393 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:06.121367 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30nnlh8"
Apr 16 17:52:06.124506 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:06.124485 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503sghvc"
Apr 16 17:52:06.211204 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:06.211104 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4888d6d1-3b5c-4e7c-a1a1-7c7e173c98f5-util\") pod \"4888d6d1-3b5c-4e7c-a1a1-7c7e173c98f5\" (UID: \"4888d6d1-3b5c-4e7c-a1a1-7c7e173c98f5\") "
Apr 16 17:52:06.211204 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:06.211163 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4888d6d1-3b5c-4e7c-a1a1-7c7e173c98f5-bundle\") pod \"4888d6d1-3b5c-4e7c-a1a1-7c7e173c98f5\" (UID: \"4888d6d1-3b5c-4e7c-a1a1-7c7e173c98f5\") "
Apr 16 17:52:06.211204 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:06.211192 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hnrss\" (UniqueName: \"kubernetes.io/projected/cb996290-8a2f-4cae-b546-78bcf0c74441-kube-api-access-hnrss\") pod \"cb996290-8a2f-4cae-b546-78bcf0c74441\" (UID: \"cb996290-8a2f-4cae-b546-78bcf0c74441\") "
Apr 16 17:52:06.211495 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:06.211230 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2zjnq\" (UniqueName: \"kubernetes.io/projected/4888d6d1-3b5c-4e7c-a1a1-7c7e173c98f5-kube-api-access-2zjnq\") pod \"4888d6d1-3b5c-4e7c-a1a1-7c7e173c98f5\" (UID: \"4888d6d1-3b5c-4e7c-a1a1-7c7e173c98f5\") "
Apr 16 17:52:06.211495 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:06.211257 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/054c19e2-d25e-4ec9-b522-205e31675425-util\") pod \"054c19e2-d25e-4ec9-b522-205e31675425\" (UID: \"054c19e2-d25e-4ec9-b522-205e31675425\") "
Apr 16 17:52:06.211495 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:06.211332 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/054c19e2-d25e-4ec9-b522-205e31675425-bundle\") pod \"054c19e2-d25e-4ec9-b522-205e31675425\" (UID: \"054c19e2-d25e-4ec9-b522-205e31675425\") "
Apr 16 17:52:06.211495 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:06.211356 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cb996290-8a2f-4cae-b546-78bcf0c74441-bundle\") pod \"cb996290-8a2f-4cae-b546-78bcf0c74441\" (UID: \"cb996290-8a2f-4cae-b546-78bcf0c74441\") "
Apr 16 17:52:06.211495 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:06.211377 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cb996290-8a2f-4cae-b546-78bcf0c74441-util\") pod \"cb996290-8a2f-4cae-b546-78bcf0c74441\" (UID: \"cb996290-8a2f-4cae-b546-78bcf0c74441\") "
Apr 16 17:52:06.211495 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:06.211408 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2p8tx\" (UniqueName: \"kubernetes.io/projected/054c19e2-d25e-4ec9-b522-205e31675425-kube-api-access-2p8tx\") pod \"054c19e2-d25e-4ec9-b522-205e31675425\" (UID: \"054c19e2-d25e-4ec9-b522-205e31675425\") "
Apr 16 17:52:06.212110 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:06.211973 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4888d6d1-3b5c-4e7c-a1a1-7c7e173c98f5-bundle" (OuterVolumeSpecName: "bundle") pod "4888d6d1-3b5c-4e7c-a1a1-7c7e173c98f5" (UID: "4888d6d1-3b5c-4e7c-a1a1-7c7e173c98f5"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 17:52:06.212579 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:06.212542 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/054c19e2-d25e-4ec9-b522-205e31675425-bundle" (OuterVolumeSpecName: "bundle") pod "054c19e2-d25e-4ec9-b522-205e31675425" (UID: "054c19e2-d25e-4ec9-b522-205e31675425"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 17:52:06.212793 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:06.212753 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb996290-8a2f-4cae-b546-78bcf0c74441-bundle" (OuterVolumeSpecName: "bundle") pod "cb996290-8a2f-4cae-b546-78bcf0c74441" (UID: "cb996290-8a2f-4cae-b546-78bcf0c74441"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 17:52:06.213688 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:06.213657 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/054c19e2-d25e-4ec9-b522-205e31675425-kube-api-access-2p8tx" (OuterVolumeSpecName: "kube-api-access-2p8tx") pod "054c19e2-d25e-4ec9-b522-205e31675425" (UID: "054c19e2-d25e-4ec9-b522-205e31675425"). InnerVolumeSpecName "kube-api-access-2p8tx". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 17:52:06.213805 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:06.213783 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb996290-8a2f-4cae-b546-78bcf0c74441-kube-api-access-hnrss" (OuterVolumeSpecName: "kube-api-access-hnrss") pod "cb996290-8a2f-4cae-b546-78bcf0c74441" (UID: "cb996290-8a2f-4cae-b546-78bcf0c74441"). InnerVolumeSpecName "kube-api-access-hnrss". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 17:52:06.213879 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:06.213855 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4888d6d1-3b5c-4e7c-a1a1-7c7e173c98f5-kube-api-access-2zjnq" (OuterVolumeSpecName: "kube-api-access-2zjnq") pod "4888d6d1-3b5c-4e7c-a1a1-7c7e173c98f5" (UID: "4888d6d1-3b5c-4e7c-a1a1-7c7e173c98f5"). InnerVolumeSpecName "kube-api-access-2zjnq". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 17:52:06.217546 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:06.217502 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/054c19e2-d25e-4ec9-b522-205e31675425-util" (OuterVolumeSpecName: "util") pod "054c19e2-d25e-4ec9-b522-205e31675425" (UID: "054c19e2-d25e-4ec9-b522-205e31675425"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 17:52:06.217769 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:06.217749 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4888d6d1-3b5c-4e7c-a1a1-7c7e173c98f5-util" (OuterVolumeSpecName: "util") pod "4888d6d1-3b5c-4e7c-a1a1-7c7e173c98f5" (UID: "4888d6d1-3b5c-4e7c-a1a1-7c7e173c98f5"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 17:52:06.218096 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:06.218077 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb996290-8a2f-4cae-b546-78bcf0c74441-util" (OuterVolumeSpecName: "util") pod "cb996290-8a2f-4cae-b546-78bcf0c74441" (UID: "cb996290-8a2f-4cae-b546-78bcf0c74441"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 17:52:06.312439 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:06.312401 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/054c19e2-d25e-4ec9-b522-205e31675425-bundle\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\""
Apr 16 17:52:06.312439 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:06.312432 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cb996290-8a2f-4cae-b546-78bcf0c74441-bundle\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\""
Apr 16 17:52:06.312439 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:06.312440 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cb996290-8a2f-4cae-b546-78bcf0c74441-util\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\""
Apr 16 17:52:06.312439 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:06.312449 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2p8tx\" (UniqueName: \"kubernetes.io/projected/054c19e2-d25e-4ec9-b522-205e31675425-kube-api-access-2p8tx\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\""
Apr 16 17:52:06.312737 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:06.312461 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4888d6d1-3b5c-4e7c-a1a1-7c7e173c98f5-util\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\""
Apr 16 17:52:06.312737 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:06.312473 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4888d6d1-3b5c-4e7c-a1a1-7c7e173c98f5-bundle\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\""
Apr 16 17:52:06.312737 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:06.312483 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hnrss\" (UniqueName: \"kubernetes.io/projected/cb996290-8a2f-4cae-b546-78bcf0c74441-kube-api-access-hnrss\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\""
Apr 16 17:52:06.312737 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:06.312491 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2zjnq\" (UniqueName: \"kubernetes.io/projected/4888d6d1-3b5c-4e7c-a1a1-7c7e173c98f5-kube-api-access-2zjnq\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\""
Apr 16 17:52:06.312737 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:06.312501 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/054c19e2-d25e-4ec9-b522-205e31675425-util\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\""
Apr 16 17:52:06.939323 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:06.939295 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30nnlh8"
Apr 16 17:52:06.939758 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:06.939286 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30nnlh8" event={"ID":"4888d6d1-3b5c-4e7c-a1a1-7c7e173c98f5","Type":"ContainerDied","Data":"244ff504cff2f438d12c9009f187b5e0ca6d4aa88067e8918d8f0102b1f514c2"}
Apr 16 17:52:06.939758 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:06.939417 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="244ff504cff2f438d12c9009f187b5e0ca6d4aa88067e8918d8f0102b1f514c2"
Apr 16 17:52:06.940977 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:06.940950 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88txflf"
Apr 16 17:52:06.940977 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:06.940957 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88txflf" event={"ID":"054c19e2-d25e-4ec9-b522-205e31675425","Type":"ContainerDied","Data":"781b68a4dde2a020708c977a8834228ee3b40daa4a066143cb135e3ac1c6fec2"}
Apr 16 17:52:06.941148 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:06.940987 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="781b68a4dde2a020708c977a8834228ee3b40daa4a066143cb135e3ac1c6fec2"
Apr 16 17:52:06.942825 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:06.942801 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503sghvc" event={"ID":"cb996290-8a2f-4cae-b546-78bcf0c74441","Type":"ContainerDied","Data":"0722c398ae8ad236b2c61672804b3638d3a90f15d21136611f3e1bda9f70d216"}
Apr 16 17:52:06.942968 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:06.942829 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0722c398ae8ad236b2c61672804b3638d3a90f15d21136611f3e1bda9f70d216"
Apr 16 17:52:06.942968 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:06.942881 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503sghvc"
Apr 16 17:52:12.458708 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:12.458665 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-7j8n5"]
Apr 16 17:52:12.459187 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:12.459137 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="054c19e2-d25e-4ec9-b522-205e31675425" containerName="util"
Apr 16 17:52:12.459187 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:12.459153 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="054c19e2-d25e-4ec9-b522-205e31675425" containerName="util"
Apr 16 17:52:12.459187 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:12.459163 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4888d6d1-3b5c-4e7c-a1a1-7c7e173c98f5" containerName="pull"
Apr 16 17:52:12.459187 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:12.459168 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="4888d6d1-3b5c-4e7c-a1a1-7c7e173c98f5" containerName="pull"
Apr 16 17:52:12.459187 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:12.459186 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4888d6d1-3b5c-4e7c-a1a1-7c7e173c98f5" containerName="extract"
Apr 16 17:52:12.459187 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:12.459193 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="4888d6d1-3b5c-4e7c-a1a1-7c7e173c98f5" containerName="extract"
Apr 16 17:52:12.459422 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:12.459201 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dafa07e0-963e-4799-884d-1fa068eed328" containerName="util"
Apr 16 17:52:12.459422 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:12.459206 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="dafa07e0-963e-4799-884d-1fa068eed328" containerName="util"
Apr 16 17:52:12.459422 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:12.459212 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4888d6d1-3b5c-4e7c-a1a1-7c7e173c98f5" containerName="util"
Apr 16 17:52:12.459422 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:12.459217 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="4888d6d1-3b5c-4e7c-a1a1-7c7e173c98f5" containerName="util"
Apr 16 17:52:12.459422 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:12.459223 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="054c19e2-d25e-4ec9-b522-205e31675425" containerName="pull"
Apr 16 17:52:12.459422 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:12.459230 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="054c19e2-d25e-4ec9-b522-205e31675425" containerName="pull"
Apr 16 17:52:12.459422 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:12.459236 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cb996290-8a2f-4cae-b546-78bcf0c74441" containerName="pull"
Apr 16 17:52:12.459422 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:12.459241 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb996290-8a2f-4cae-b546-78bcf0c74441" containerName="pull"
Apr 16 17:52:12.459422 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:12.459249 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="054c19e2-d25e-4ec9-b522-205e31675425" containerName="extract"
Apr 16 17:52:12.459422 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:12.459258 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="054c19e2-d25e-4ec9-b522-205e31675425" containerName="extract"
Apr 16 17:52:12.459422 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:12.459274 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dafa07e0-963e-4799-884d-1fa068eed328" containerName="pull"
Apr 16 17:52:12.459422 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:12.459281 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="dafa07e0-963e-4799-884d-1fa068eed328" containerName="pull" Apr 16 17:52:12.459422 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:12.459289 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dafa07e0-963e-4799-884d-1fa068eed328" containerName="extract" Apr 16 17:52:12.459422 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:12.459297 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="dafa07e0-963e-4799-884d-1fa068eed328" containerName="extract" Apr 16 17:52:12.459422 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:12.459307 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cb996290-8a2f-4cae-b546-78bcf0c74441" containerName="extract" Apr 16 17:52:12.459422 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:12.459314 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb996290-8a2f-4cae-b546-78bcf0c74441" containerName="extract" Apr 16 17:52:12.459422 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:12.459336 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cb996290-8a2f-4cae-b546-78bcf0c74441" containerName="util" Apr 16 17:52:12.459422 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:12.459343 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb996290-8a2f-4cae-b546-78bcf0c74441" containerName="util" Apr 16 17:52:12.459422 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:12.459429 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="cb996290-8a2f-4cae-b546-78bcf0c74441" containerName="extract" Apr 16 17:52:12.459997 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:12.459443 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="054c19e2-d25e-4ec9-b522-205e31675425" containerName="extract" Apr 16 17:52:12.459997 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:12.459449 2576 
memory_manager.go:356] "RemoveStaleState removing state" podUID="4888d6d1-3b5c-4e7c-a1a1-7c7e173c98f5" containerName="extract" Apr 16 17:52:12.459997 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:12.459455 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="dafa07e0-963e-4799-884d-1fa068eed328" containerName="extract" Apr 16 17:52:12.463755 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:12.463737 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-7j8n5" Apr 16 17:52:12.466503 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:12.466481 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 16 17:52:12.467736 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:12.467714 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 16 17:52:12.467813 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:12.467718 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"limitador-operator-controller-manager-dockercfg-f9t7d\"" Apr 16 17:52:12.477946 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:12.477924 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-7j8n5"] Apr 16 17:52:12.568615 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:12.568573 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6b6x5\" (UniqueName: \"kubernetes.io/projected/f28ebcd1-ba12-4f95-ba66-e35818bba52a-kube-api-access-6b6x5\") pod \"limitador-operator-controller-manager-c7fb4c8d5-7j8n5\" (UID: \"f28ebcd1-ba12-4f95-ba66-e35818bba52a\") " pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-7j8n5" Apr 16 17:52:12.669282 ip-10-0-141-32 kubenswrapper[2576]: 
I0416 17:52:12.669239 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6b6x5\" (UniqueName: \"kubernetes.io/projected/f28ebcd1-ba12-4f95-ba66-e35818bba52a-kube-api-access-6b6x5\") pod \"limitador-operator-controller-manager-c7fb4c8d5-7j8n5\" (UID: \"f28ebcd1-ba12-4f95-ba66-e35818bba52a\") " pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-7j8n5" Apr 16 17:52:12.679927 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:12.679893 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6b6x5\" (UniqueName: \"kubernetes.io/projected/f28ebcd1-ba12-4f95-ba66-e35818bba52a-kube-api-access-6b6x5\") pod \"limitador-operator-controller-manager-c7fb4c8d5-7j8n5\" (UID: \"f28ebcd1-ba12-4f95-ba66-e35818bba52a\") " pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-7j8n5" Apr 16 17:52:12.774097 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:12.774007 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-7j8n5" Apr 16 17:52:12.910128 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:12.910100 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-7j8n5"] Apr 16 17:52:12.911853 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:52:12.911820 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf28ebcd1_ba12_4f95_ba66_e35818bba52a.slice/crio-f7f6b29a1bd671fb7ce565baa17bc51239faf740875101c4f050db4bb73e3b20 WatchSource:0}: Error finding container f7f6b29a1bd671fb7ce565baa17bc51239faf740875101c4f050db4bb73e3b20: Status 404 returned error can't find the container with id f7f6b29a1bd671fb7ce565baa17bc51239faf740875101c4f050db4bb73e3b20 Apr 16 17:52:12.965261 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:12.965227 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-7j8n5" event={"ID":"f28ebcd1-ba12-4f95-ba66-e35818bba52a","Type":"ContainerStarted","Data":"f7f6b29a1bd671fb7ce565baa17bc51239faf740875101c4f050db4bb73e3b20"} Apr 16 17:52:15.984606 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:15.984568 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-7j8n5" event={"ID":"f28ebcd1-ba12-4f95-ba66-e35818bba52a","Type":"ContainerStarted","Data":"6b335dfd3c98701426d4c6fdee22636adb68487c8b34480eec5c106855f22112"} Apr 16 17:52:15.985107 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:15.984662 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-7j8n5" Apr 16 17:52:16.006421 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:16.006367 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-7j8n5" podStartSLOduration=1.633733437 podStartE2EDuration="4.00635007s" podCreationTimestamp="2026-04-16 17:52:12 +0000 UTC" firstStartedPulling="2026-04-16 17:52:12.913992487 +0000 UTC m=+730.930459986" lastFinishedPulling="2026-04-16 17:52:15.28660911 +0000 UTC m=+733.303076619" observedRunningTime="2026-04-16 17:52:16.006216918 +0000 UTC m=+734.022684441" watchObservedRunningTime="2026-04-16 17:52:16.00635007 +0000 UTC m=+734.022817604" Apr 16 17:52:21.082598 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:21.082562 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-bw45t"] Apr 16 17:52:21.090452 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:21.090435 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-bw45t" Apr 16 17:52:21.093569 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:21.093546 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-zb8l5\"" Apr 16 17:52:21.102692 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:21.102668 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-bw45t"] Apr 16 17:52:21.247631 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:21.247596 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/b9a6220d-cf89-4be1-adf4-f1bd40e9b09e-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-bw45t\" (UID: \"b9a6220d-cf89-4be1-adf4-f1bd40e9b09e\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-bw45t" Apr 16 17:52:21.247794 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:21.247681 2576 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjxmt\" (UniqueName: \"kubernetes.io/projected/b9a6220d-cf89-4be1-adf4-f1bd40e9b09e-kube-api-access-kjxmt\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-bw45t\" (UID: \"b9a6220d-cf89-4be1-adf4-f1bd40e9b09e\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-bw45t" Apr 16 17:52:21.348805 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:21.348701 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/b9a6220d-cf89-4be1-adf4-f1bd40e9b09e-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-bw45t\" (UID: \"b9a6220d-cf89-4be1-adf4-f1bd40e9b09e\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-bw45t" Apr 16 17:52:21.348954 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:21.348803 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kjxmt\" (UniqueName: \"kubernetes.io/projected/b9a6220d-cf89-4be1-adf4-f1bd40e9b09e-kube-api-access-kjxmt\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-bw45t\" (UID: \"b9a6220d-cf89-4be1-adf4-f1bd40e9b09e\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-bw45t" Apr 16 17:52:21.349161 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:21.349141 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/b9a6220d-cf89-4be1-adf4-f1bd40e9b09e-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-bw45t\" (UID: \"b9a6220d-cf89-4be1-adf4-f1bd40e9b09e\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-bw45t" Apr 16 17:52:21.359452 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:21.359418 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-kjxmt\" (UniqueName: \"kubernetes.io/projected/b9a6220d-cf89-4be1-adf4-f1bd40e9b09e-kube-api-access-kjxmt\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-bw45t\" (UID: \"b9a6220d-cf89-4be1-adf4-f1bd40e9b09e\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-bw45t" Apr 16 17:52:21.400586 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:21.400545 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-bw45t" Apr 16 17:52:21.535099 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:21.535073 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-bw45t"] Apr 16 17:52:21.536679 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:52:21.536644 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9a6220d_cf89_4be1_adf4_f1bd40e9b09e.slice/crio-d9e150d6cafc943b38d4cb7f996332f56858f3dd3e9ebca16420ec89ff330609 WatchSource:0}: Error finding container d9e150d6cafc943b38d4cb7f996332f56858f3dd3e9ebca16420ec89ff330609: Status 404 returned error can't find the container with id d9e150d6cafc943b38d4cb7f996332f56858f3dd3e9ebca16420ec89ff330609 Apr 16 17:52:22.009182 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:22.009141 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-bw45t" event={"ID":"b9a6220d-cf89-4be1-adf4-f1bd40e9b09e","Type":"ContainerStarted","Data":"d9e150d6cafc943b38d4cb7f996332f56858f3dd3e9ebca16420ec89ff330609"} Apr 16 17:52:26.029378 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:26.029340 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-bw45t" 
event={"ID":"b9a6220d-cf89-4be1-adf4-f1bd40e9b09e","Type":"ContainerStarted","Data":"d58d8934dffdda7334148a1783ce180511485f3a612ce021d00e6184e18de6cc"} Apr 16 17:52:26.029759 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:26.029446 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-bw45t" Apr 16 17:52:26.064569 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:26.064517 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-bw45t" podStartSLOduration=1.286373401 podStartE2EDuration="5.064501945s" podCreationTimestamp="2026-04-16 17:52:21 +0000 UTC" firstStartedPulling="2026-04-16 17:52:21.539313997 +0000 UTC m=+739.555781500" lastFinishedPulling="2026-04-16 17:52:25.317442539 +0000 UTC m=+743.333910044" observedRunningTime="2026-04-16 17:52:26.062813579 +0000 UTC m=+744.079281102" watchObservedRunningTime="2026-04-16 17:52:26.064501945 +0000 UTC m=+744.080969486" Apr 16 17:52:26.990771 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:26.990741 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-7j8n5" Apr 16 17:52:37.036833 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:52:37.036800 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-bw45t" Apr 16 17:53:10.695729 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:10.695696 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-79cbc94b89-5fjmk"] Apr 16 17:53:10.699407 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:10.699390 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-5fjmk" Apr 16 17:53:10.702627 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:10.702602 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-j2xjx\"" Apr 16 17:53:10.708932 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:10.708892 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-5fjmk"] Apr 16 17:53:10.863255 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:10.863216 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qr9x\" (UniqueName: \"kubernetes.io/projected/268f74e3-65fa-4f71-9285-fe00348457e8-kube-api-access-8qr9x\") pod \"authorino-79cbc94b89-5fjmk\" (UID: \"268f74e3-65fa-4f71-9285-fe00348457e8\") " pod="kuadrant-system/authorino-79cbc94b89-5fjmk" Apr 16 17:53:10.963842 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:10.963750 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8qr9x\" (UniqueName: \"kubernetes.io/projected/268f74e3-65fa-4f71-9285-fe00348457e8-kube-api-access-8qr9x\") pod \"authorino-79cbc94b89-5fjmk\" (UID: \"268f74e3-65fa-4f71-9285-fe00348457e8\") " pod="kuadrant-system/authorino-79cbc94b89-5fjmk" Apr 16 17:53:10.974262 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:10.974231 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qr9x\" (UniqueName: \"kubernetes.io/projected/268f74e3-65fa-4f71-9285-fe00348457e8-kube-api-access-8qr9x\") pod \"authorino-79cbc94b89-5fjmk\" (UID: \"268f74e3-65fa-4f71-9285-fe00348457e8\") " pod="kuadrant-system/authorino-79cbc94b89-5fjmk" Apr 16 17:53:11.015254 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:11.015218 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-5fjmk" Apr 16 17:53:11.143572 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:11.143548 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-5fjmk"] Apr 16 17:53:11.145089 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:53:11.145066 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod268f74e3_65fa_4f71_9285_fe00348457e8.slice/crio-7cf46455208713e38306117cfff32498205d25d4802be9f1713c03f6e8bc0498 WatchSource:0}: Error finding container 7cf46455208713e38306117cfff32498205d25d4802be9f1713c03f6e8bc0498: Status 404 returned error can't find the container with id 7cf46455208713e38306117cfff32498205d25d4802be9f1713c03f6e8bc0498 Apr 16 17:53:11.146305 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:11.146285 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 17:53:11.195740 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:11.195702 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-5fjmk" event={"ID":"268f74e3-65fa-4f71-9285-fe00348457e8","Type":"ContainerStarted","Data":"7cf46455208713e38306117cfff32498205d25d4802be9f1713c03f6e8bc0498"} Apr 16 17:53:14.210290 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:14.210254 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-5fjmk" event={"ID":"268f74e3-65fa-4f71-9285-fe00348457e8","Type":"ContainerStarted","Data":"599c67870fb7a522d6cd21bbe725a16708e78f0095bdae674e230b7fbfd8dc45"} Apr 16 17:53:14.227812 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:14.227751 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-79cbc94b89-5fjmk" podStartSLOduration=2.02140014 podStartE2EDuration="4.227732265s" podCreationTimestamp="2026-04-16 17:53:10 +0000 UTC" 
firstStartedPulling="2026-04-16 17:53:11.146417127 +0000 UTC m=+789.162884626" lastFinishedPulling="2026-04-16 17:53:13.352749249 +0000 UTC m=+791.369216751" observedRunningTime="2026-04-16 17:53:14.226877476 +0000 UTC m=+792.243344997" watchObservedRunningTime="2026-04-16 17:53:14.227732265 +0000 UTC m=+792.244199787" Apr 16 17:53:34.848598 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:34.848514 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-5fjmk"] Apr 16 17:53:34.849142 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:34.848729 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-79cbc94b89-5fjmk" podUID="268f74e3-65fa-4f71-9285-fe00348457e8" containerName="authorino" containerID="cri-o://599c67870fb7a522d6cd21bbe725a16708e78f0095bdae674e230b7fbfd8dc45" gracePeriod=30 Apr 16 17:53:35.098028 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:35.098001 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-5fjmk" Apr 16 17:53:35.165112 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:35.165080 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8qr9x\" (UniqueName: \"kubernetes.io/projected/268f74e3-65fa-4f71-9285-fe00348457e8-kube-api-access-8qr9x\") pod \"268f74e3-65fa-4f71-9285-fe00348457e8\" (UID: \"268f74e3-65fa-4f71-9285-fe00348457e8\") " Apr 16 17:53:35.167267 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:35.167235 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/268f74e3-65fa-4f71-9285-fe00348457e8-kube-api-access-8qr9x" (OuterVolumeSpecName: "kube-api-access-8qr9x") pod "268f74e3-65fa-4f71-9285-fe00348457e8" (UID: "268f74e3-65fa-4f71-9285-fe00348457e8"). InnerVolumeSpecName "kube-api-access-8qr9x". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 17:53:35.266612 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:35.266575 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8qr9x\" (UniqueName: \"kubernetes.io/projected/268f74e3-65fa-4f71-9285-fe00348457e8-kube-api-access-8qr9x\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\"" Apr 16 17:53:35.289842 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:35.289809 2576 generic.go:358] "Generic (PLEG): container finished" podID="268f74e3-65fa-4f71-9285-fe00348457e8" containerID="599c67870fb7a522d6cd21bbe725a16708e78f0095bdae674e230b7fbfd8dc45" exitCode=0 Apr 16 17:53:35.290026 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:35.289866 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-5fjmk" Apr 16 17:53:35.290026 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:35.289872 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-5fjmk" event={"ID":"268f74e3-65fa-4f71-9285-fe00348457e8","Type":"ContainerDied","Data":"599c67870fb7a522d6cd21bbe725a16708e78f0095bdae674e230b7fbfd8dc45"} Apr 16 17:53:35.290026 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:35.289899 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-5fjmk" event={"ID":"268f74e3-65fa-4f71-9285-fe00348457e8","Type":"ContainerDied","Data":"7cf46455208713e38306117cfff32498205d25d4802be9f1713c03f6e8bc0498"} Apr 16 17:53:35.290026 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:35.289933 2576 scope.go:117] "RemoveContainer" containerID="599c67870fb7a522d6cd21bbe725a16708e78f0095bdae674e230b7fbfd8dc45" Apr 16 17:53:35.299102 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:35.299049 2576 scope.go:117] "RemoveContainer" containerID="599c67870fb7a522d6cd21bbe725a16708e78f0095bdae674e230b7fbfd8dc45" Apr 16 17:53:35.299337 ip-10-0-141-32 kubenswrapper[2576]: E0416 
17:53:35.299315 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"599c67870fb7a522d6cd21bbe725a16708e78f0095bdae674e230b7fbfd8dc45\": container with ID starting with 599c67870fb7a522d6cd21bbe725a16708e78f0095bdae674e230b7fbfd8dc45 not found: ID does not exist" containerID="599c67870fb7a522d6cd21bbe725a16708e78f0095bdae674e230b7fbfd8dc45" Apr 16 17:53:35.299400 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:35.299350 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"599c67870fb7a522d6cd21bbe725a16708e78f0095bdae674e230b7fbfd8dc45"} err="failed to get container status \"599c67870fb7a522d6cd21bbe725a16708e78f0095bdae674e230b7fbfd8dc45\": rpc error: code = NotFound desc = could not find container \"599c67870fb7a522d6cd21bbe725a16708e78f0095bdae674e230b7fbfd8dc45\": container with ID starting with 599c67870fb7a522d6cd21bbe725a16708e78f0095bdae674e230b7fbfd8dc45 not found: ID does not exist" Apr 16 17:53:35.313070 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:35.313040 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-5fjmk"] Apr 16 17:53:35.319662 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:35.319630 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-5fjmk"] Apr 16 17:53:36.632185 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:36.628288 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="268f74e3-65fa-4f71-9285-fe00348457e8" path="/var/lib/kubelet/pods/268f74e3-65fa-4f71-9285-fe00348457e8/volumes" Apr 16 17:53:43.174794 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:43.174754 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-4xmn4"] Apr 16 17:53:43.175303 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:43.175117 2576 cpu_manager.go:401] "RemoveStaleState: 
containerMap: removing container" podUID="268f74e3-65fa-4f71-9285-fe00348457e8" containerName="authorino" Apr 16 17:53:43.175303 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:43.175128 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="268f74e3-65fa-4f71-9285-fe00348457e8" containerName="authorino" Apr 16 17:53:43.175303 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:43.175182 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="268f74e3-65fa-4f71-9285-fe00348457e8" containerName="authorino" Apr 16 17:53:43.179631 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:43.179607 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4xmn4" Apr 16 17:53:43.204799 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:43.202836 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-4xmn4"] Apr 16 17:53:43.234301 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:43.234266 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/e37d1415-b492-405e-9870-3b7c399f5c0d-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-4xmn4\" (UID: \"e37d1415-b492-405e-9870-3b7c399f5c0d\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4xmn4" Apr 16 17:53:43.234494 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:43.234309 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/e37d1415-b492-405e-9870-3b7c399f5c0d-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-4xmn4\" (UID: \"e37d1415-b492-405e-9870-3b7c399f5c0d\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4xmn4" Apr 16 17:53:43.234494 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:43.234384 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/e37d1415-b492-405e-9870-3b7c399f5c0d-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-4xmn4\" (UID: \"e37d1415-b492-405e-9870-3b7c399f5c0d\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4xmn4" Apr 16 17:53:43.234494 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:43.234426 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/e37d1415-b492-405e-9870-3b7c399f5c0d-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-4xmn4\" (UID: \"e37d1415-b492-405e-9870-3b7c399f5c0d\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4xmn4" Apr 16 17:53:43.234494 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:43.234477 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rccsj\" (UniqueName: \"kubernetes.io/projected/e37d1415-b492-405e-9870-3b7c399f5c0d-kube-api-access-rccsj\") pod \"istiod-openshift-gateway-55ff986f96-4xmn4\" (UID: \"e37d1415-b492-405e-9870-3b7c399f5c0d\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4xmn4" Apr 16 17:53:43.234685 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:43.234599 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/e37d1415-b492-405e-9870-3b7c399f5c0d-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-4xmn4\" (UID: \"e37d1415-b492-405e-9870-3b7c399f5c0d\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4xmn4" Apr 16 17:53:43.234685 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:43.234670 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: 
\"kubernetes.io/secret/e37d1415-b492-405e-9870-3b7c399f5c0d-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-4xmn4\" (UID: \"e37d1415-b492-405e-9870-3b7c399f5c0d\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4xmn4" Apr 16 17:53:43.336139 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:43.336104 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/e37d1415-b492-405e-9870-3b7c399f5c0d-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-4xmn4\" (UID: \"e37d1415-b492-405e-9870-3b7c399f5c0d\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4xmn4" Apr 16 17:53:43.336323 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:43.336167 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/e37d1415-b492-405e-9870-3b7c399f5c0d-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-4xmn4\" (UID: \"e37d1415-b492-405e-9870-3b7c399f5c0d\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4xmn4" Apr 16 17:53:43.336323 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:43.336205 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/e37d1415-b492-405e-9870-3b7c399f5c0d-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-4xmn4\" (UID: \"e37d1415-b492-405e-9870-3b7c399f5c0d\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4xmn4" Apr 16 17:53:43.336323 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:43.336242 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/e37d1415-b492-405e-9870-3b7c399f5c0d-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-4xmn4\" (UID: \"e37d1415-b492-405e-9870-3b7c399f5c0d\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4xmn4" Apr 16 
17:53:43.336323 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:43.336316 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/e37d1415-b492-405e-9870-3b7c399f5c0d-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-4xmn4\" (UID: \"e37d1415-b492-405e-9870-3b7c399f5c0d\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4xmn4" Apr 16 17:53:43.336537 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:43.336357 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/e37d1415-b492-405e-9870-3b7c399f5c0d-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-4xmn4\" (UID: \"e37d1415-b492-405e-9870-3b7c399f5c0d\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4xmn4" Apr 16 17:53:43.336537 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:43.336383 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rccsj\" (UniqueName: \"kubernetes.io/projected/e37d1415-b492-405e-9870-3b7c399f5c0d-kube-api-access-rccsj\") pod \"istiod-openshift-gateway-55ff986f96-4xmn4\" (UID: \"e37d1415-b492-405e-9870-3b7c399f5c0d\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4xmn4" Apr 16 17:53:43.337085 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:43.337057 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/e37d1415-b492-405e-9870-3b7c399f5c0d-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-4xmn4\" (UID: \"e37d1415-b492-405e-9870-3b7c399f5c0d\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4xmn4" Apr 16 17:53:43.338611 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:43.338583 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: 
\"kubernetes.io/secret/e37d1415-b492-405e-9870-3b7c399f5c0d-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-4xmn4\" (UID: \"e37d1415-b492-405e-9870-3b7c399f5c0d\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4xmn4" Apr 16 17:53:43.338711 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:43.338652 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/e37d1415-b492-405e-9870-3b7c399f5c0d-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-4xmn4\" (UID: \"e37d1415-b492-405e-9870-3b7c399f5c0d\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4xmn4" Apr 16 17:53:43.338711 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:43.338618 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/e37d1415-b492-405e-9870-3b7c399f5c0d-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-4xmn4\" (UID: \"e37d1415-b492-405e-9870-3b7c399f5c0d\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4xmn4" Apr 16 17:53:43.338825 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:43.338763 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/e37d1415-b492-405e-9870-3b7c399f5c0d-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-4xmn4\" (UID: \"e37d1415-b492-405e-9870-3b7c399f5c0d\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4xmn4" Apr 16 17:53:43.346174 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:43.346150 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/e37d1415-b492-405e-9870-3b7c399f5c0d-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-4xmn4\" (UID: \"e37d1415-b492-405e-9870-3b7c399f5c0d\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4xmn4" Apr 16 17:53:43.347033 
ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:43.347008 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rccsj\" (UniqueName: \"kubernetes.io/projected/e37d1415-b492-405e-9870-3b7c399f5c0d-kube-api-access-rccsj\") pod \"istiod-openshift-gateway-55ff986f96-4xmn4\" (UID: \"e37d1415-b492-405e-9870-3b7c399f5c0d\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4xmn4" Apr 16 17:53:43.489250 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:43.489146 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4xmn4" Apr 16 17:53:43.658708 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:43.658679 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-4xmn4"] Apr 16 17:53:43.661491 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:53:43.661443 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode37d1415_b492_405e_9870_3b7c399f5c0d.slice/crio-112cea5512122d8fc358030fc9e84da72ec63ea88088a2e22597aeb71a89707b WatchSource:0}: Error finding container 112cea5512122d8fc358030fc9e84da72ec63ea88088a2e22597aeb71a89707b: Status 404 returned error can't find the container with id 112cea5512122d8fc358030fc9e84da72ec63ea88088a2e22597aeb71a89707b Apr 16 17:53:43.664045 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:43.664007 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 16 17:53:43.664143 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:43.664083 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 16 17:53:44.328030 ip-10-0-141-32 kubenswrapper[2576]: I0416 
17:53:44.327992 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4xmn4" event={"ID":"e37d1415-b492-405e-9870-3b7c399f5c0d","Type":"ContainerStarted","Data":"dc9ce6378955703a157a0f0e71519bbf489f621f3166bf70ca2a2ee41541e1b5"} Apr 16 17:53:44.328030 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:44.328035 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4xmn4" event={"ID":"e37d1415-b492-405e-9870-3b7c399f5c0d","Type":"ContainerStarted","Data":"112cea5512122d8fc358030fc9e84da72ec63ea88088a2e22597aeb71a89707b"} Apr 16 17:53:44.328555 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:44.328236 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4xmn4" Apr 16 17:53:44.330210 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:44.330167 2576 patch_prober.go:28] interesting pod/istiod-openshift-gateway-55ff986f96-4xmn4 container/discovery namespace/openshift-ingress: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body= Apr 16 17:53:44.330344 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:44.330223 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4xmn4" podUID="e37d1415-b492-405e-9870-3b7c399f5c0d" containerName="discovery" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 17:53:44.362839 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:44.362790 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4xmn4" podStartSLOduration=1.3627760150000001 podStartE2EDuration="1.362776015s" podCreationTimestamp="2026-04-16 17:53:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-04-16 17:53:44.359002522 +0000 UTC m=+822.375470046" watchObservedRunningTime="2026-04-16 17:53:44.362776015 +0000 UTC m=+822.379243547" Apr 16 17:53:45.333009 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:45.332978 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4xmn4" Apr 16 17:53:45.449230 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:45.449193 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-t4r8z"] Apr 16 17:53:45.449475 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:45.449435 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-t4r8z" podUID="00a0b91a-9bc8-4e6a-8270-bcf005283af3" containerName="discovery" containerID="cri-o://f6846be95aed1ec332b96ee3ecafae042c37052065e7dc686827452bce646406" gracePeriod=30 Apr 16 17:53:45.694137 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:45.694109 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-t4r8z" Apr 16 17:53:45.758313 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:45.758282 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/00a0b91a-9bc8-4e6a-8270-bcf005283af3-cacerts\") pod \"00a0b91a-9bc8-4e6a-8270-bcf005283af3\" (UID: \"00a0b91a-9bc8-4e6a-8270-bcf005283af3\") " Apr 16 17:53:45.758313 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:45.758315 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/00a0b91a-9bc8-4e6a-8270-bcf005283af3-istio-kubeconfig\") pod \"00a0b91a-9bc8-4e6a-8270-bcf005283af3\" (UID: \"00a0b91a-9bc8-4e6a-8270-bcf005283af3\") " Apr 16 17:53:45.758530 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:45.758372 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/00a0b91a-9bc8-4e6a-8270-bcf005283af3-local-certs\") pod \"00a0b91a-9bc8-4e6a-8270-bcf005283af3\" (UID: \"00a0b91a-9bc8-4e6a-8270-bcf005283af3\") " Apr 16 17:53:45.758530 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:45.758407 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/00a0b91a-9bc8-4e6a-8270-bcf005283af3-istio-token\") pod \"00a0b91a-9bc8-4e6a-8270-bcf005283af3\" (UID: \"00a0b91a-9bc8-4e6a-8270-bcf005283af3\") " Apr 16 17:53:45.758530 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:45.758435 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zdlr8\" (UniqueName: \"kubernetes.io/projected/00a0b91a-9bc8-4e6a-8270-bcf005283af3-kube-api-access-zdlr8\") pod \"00a0b91a-9bc8-4e6a-8270-bcf005283af3\" (UID: \"00a0b91a-9bc8-4e6a-8270-bcf005283af3\") " Apr 16 17:53:45.758530 ip-10-0-141-32 kubenswrapper[2576]: 
I0416 17:53:45.758477 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/00a0b91a-9bc8-4e6a-8270-bcf005283af3-istio-csr-dns-cert\") pod \"00a0b91a-9bc8-4e6a-8270-bcf005283af3\" (UID: \"00a0b91a-9bc8-4e6a-8270-bcf005283af3\") " Apr 16 17:53:45.758530 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:45.758529 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/00a0b91a-9bc8-4e6a-8270-bcf005283af3-istio-csr-ca-configmap\") pod \"00a0b91a-9bc8-4e6a-8270-bcf005283af3\" (UID: \"00a0b91a-9bc8-4e6a-8270-bcf005283af3\") " Apr 16 17:53:45.759262 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:45.759070 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00a0b91a-9bc8-4e6a-8270-bcf005283af3-istio-csr-ca-configmap" (OuterVolumeSpecName: "istio-csr-ca-configmap") pod "00a0b91a-9bc8-4e6a-8270-bcf005283af3" (UID: "00a0b91a-9bc8-4e6a-8270-bcf005283af3"). InnerVolumeSpecName "istio-csr-ca-configmap". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 17:53:45.761046 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:45.761014 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00a0b91a-9bc8-4e6a-8270-bcf005283af3-cacerts" (OuterVolumeSpecName: "cacerts") pod "00a0b91a-9bc8-4e6a-8270-bcf005283af3" (UID: "00a0b91a-9bc8-4e6a-8270-bcf005283af3"). InnerVolumeSpecName "cacerts". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 17:53:45.761165 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:45.761052 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00a0b91a-9bc8-4e6a-8270-bcf005283af3-istio-kubeconfig" (OuterVolumeSpecName: "istio-kubeconfig") pod "00a0b91a-9bc8-4e6a-8270-bcf005283af3" (UID: "00a0b91a-9bc8-4e6a-8270-bcf005283af3"). 
InnerVolumeSpecName "istio-kubeconfig". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 17:53:45.761165 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:45.761075 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00a0b91a-9bc8-4e6a-8270-bcf005283af3-istio-csr-dns-cert" (OuterVolumeSpecName: "istio-csr-dns-cert") pod "00a0b91a-9bc8-4e6a-8270-bcf005283af3" (UID: "00a0b91a-9bc8-4e6a-8270-bcf005283af3"). InnerVolumeSpecName "istio-csr-dns-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 17:53:45.761165 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:45.761120 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00a0b91a-9bc8-4e6a-8270-bcf005283af3-istio-token" (OuterVolumeSpecName: "istio-token") pod "00a0b91a-9bc8-4e6a-8270-bcf005283af3" (UID: "00a0b91a-9bc8-4e6a-8270-bcf005283af3"). InnerVolumeSpecName "istio-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 17:53:45.761308 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:45.761181 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00a0b91a-9bc8-4e6a-8270-bcf005283af3-local-certs" (OuterVolumeSpecName: "local-certs") pod "00a0b91a-9bc8-4e6a-8270-bcf005283af3" (UID: "00a0b91a-9bc8-4e6a-8270-bcf005283af3"). InnerVolumeSpecName "local-certs". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:53:45.761308 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:45.761260 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00a0b91a-9bc8-4e6a-8270-bcf005283af3-kube-api-access-zdlr8" (OuterVolumeSpecName: "kube-api-access-zdlr8") pod "00a0b91a-9bc8-4e6a-8270-bcf005283af3" (UID: "00a0b91a-9bc8-4e6a-8270-bcf005283af3"). InnerVolumeSpecName "kube-api-access-zdlr8". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 17:53:45.859650 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:45.859601 2576 reconciler_common.go:299] "Volume detached for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/00a0b91a-9bc8-4e6a-8270-bcf005283af3-local-certs\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\"" Apr 16 17:53:45.859650 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:45.859643 2576 reconciler_common.go:299] "Volume detached for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/00a0b91a-9bc8-4e6a-8270-bcf005283af3-istio-token\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\"" Apr 16 17:53:45.859650 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:45.859653 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zdlr8\" (UniqueName: \"kubernetes.io/projected/00a0b91a-9bc8-4e6a-8270-bcf005283af3-kube-api-access-zdlr8\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\"" Apr 16 17:53:45.859650 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:45.859665 2576 reconciler_common.go:299] "Volume detached for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/00a0b91a-9bc8-4e6a-8270-bcf005283af3-istio-csr-dns-cert\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\"" Apr 16 17:53:45.859943 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:45.859675 2576 reconciler_common.go:299] "Volume detached for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/00a0b91a-9bc8-4e6a-8270-bcf005283af3-istio-csr-ca-configmap\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\"" Apr 16 17:53:45.859943 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:45.859684 2576 reconciler_common.go:299] "Volume detached for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/00a0b91a-9bc8-4e6a-8270-bcf005283af3-cacerts\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\"" Apr 16 17:53:45.859943 ip-10-0-141-32 kubenswrapper[2576]: I0416 
17:53:45.859692 2576 reconciler_common.go:299] "Volume detached for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/00a0b91a-9bc8-4e6a-8270-bcf005283af3-istio-kubeconfig\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\"" Apr 16 17:53:46.337258 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:46.337222 2576 generic.go:358] "Generic (PLEG): container finished" podID="00a0b91a-9bc8-4e6a-8270-bcf005283af3" containerID="f6846be95aed1ec332b96ee3ecafae042c37052065e7dc686827452bce646406" exitCode=0 Apr 16 17:53:46.337719 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:46.337298 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-t4r8z" Apr 16 17:53:46.337719 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:46.337308 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-t4r8z" event={"ID":"00a0b91a-9bc8-4e6a-8270-bcf005283af3","Type":"ContainerDied","Data":"f6846be95aed1ec332b96ee3ecafae042c37052065e7dc686827452bce646406"} Apr 16 17:53:46.337719 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:46.337351 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-t4r8z" event={"ID":"00a0b91a-9bc8-4e6a-8270-bcf005283af3","Type":"ContainerDied","Data":"9d96e5b30e1ad1ee5626c63cfb66c8f524abc48a079f82e27f29ac9e3c6a7cdd"} Apr 16 17:53:46.337719 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:46.337371 2576 scope.go:117] "RemoveContainer" containerID="f6846be95aed1ec332b96ee3ecafae042c37052065e7dc686827452bce646406" Apr 16 17:53:46.346778 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:46.346756 2576 scope.go:117] "RemoveContainer" containerID="f6846be95aed1ec332b96ee3ecafae042c37052065e7dc686827452bce646406" Apr 16 17:53:46.347056 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:53:46.347036 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc 
error: code = NotFound desc = could not find container \"f6846be95aed1ec332b96ee3ecafae042c37052065e7dc686827452bce646406\": container with ID starting with f6846be95aed1ec332b96ee3ecafae042c37052065e7dc686827452bce646406 not found: ID does not exist" containerID="f6846be95aed1ec332b96ee3ecafae042c37052065e7dc686827452bce646406" Apr 16 17:53:46.347138 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:46.347071 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6846be95aed1ec332b96ee3ecafae042c37052065e7dc686827452bce646406"} err="failed to get container status \"f6846be95aed1ec332b96ee3ecafae042c37052065e7dc686827452bce646406\": rpc error: code = NotFound desc = could not find container \"f6846be95aed1ec332b96ee3ecafae042c37052065e7dc686827452bce646406\": container with ID starting with f6846be95aed1ec332b96ee3ecafae042c37052065e7dc686827452bce646406 not found: ID does not exist" Apr 16 17:53:46.366316 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:46.366264 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-t4r8z"] Apr 16 17:53:46.378207 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:46.378167 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-t4r8z"] Apr 16 17:53:46.628150 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:46.628069 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00a0b91a-9bc8-4e6a-8270-bcf005283af3" path="/var/lib/kubelet/pods/00a0b91a-9bc8-4e6a-8270-bcf005283af3/volumes" Apr 16 17:53:53.488697 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:53.488662 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-5644d5958-zwnc7"] Apr 16 17:53:53.489161 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:53.489021 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="00a0b91a-9bc8-4e6a-8270-bcf005283af3" containerName="discovery" Apr 16 17:53:53.489161 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:53.489034 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="00a0b91a-9bc8-4e6a-8270-bcf005283af3" containerName="discovery" Apr 16 17:53:53.489161 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:53.489093 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="00a0b91a-9bc8-4e6a-8270-bcf005283af3" containerName="discovery" Apr 16 17:53:53.493274 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:53.493250 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-5644d5958-zwnc7" Apr 16 17:53:53.497173 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:53.497148 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 16 17:53:53.497173 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:53.497160 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 16 17:53:53.497372 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:53.497154 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-94czs\"" Apr 16 17:53:53.497372 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:53.497194 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 16 17:53:53.507949 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:53.507923 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-5644d5958-zwnc7"] Apr 16 17:53:53.628795 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:53.628758 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/33f61e9d-3dfb-429d-a797-917fb922e521-cert\") pod 
\"llmisvc-controller-manager-5644d5958-zwnc7\" (UID: \"33f61e9d-3dfb-429d-a797-917fb922e521\") " pod="kserve/llmisvc-controller-manager-5644d5958-zwnc7" Apr 16 17:53:53.628995 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:53.628821 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwxjx\" (UniqueName: \"kubernetes.io/projected/33f61e9d-3dfb-429d-a797-917fb922e521-kube-api-access-pwxjx\") pod \"llmisvc-controller-manager-5644d5958-zwnc7\" (UID: \"33f61e9d-3dfb-429d-a797-917fb922e521\") " pod="kserve/llmisvc-controller-manager-5644d5958-zwnc7" Apr 16 17:53:53.729265 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:53.729229 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pwxjx\" (UniqueName: \"kubernetes.io/projected/33f61e9d-3dfb-429d-a797-917fb922e521-kube-api-access-pwxjx\") pod \"llmisvc-controller-manager-5644d5958-zwnc7\" (UID: \"33f61e9d-3dfb-429d-a797-917fb922e521\") " pod="kserve/llmisvc-controller-manager-5644d5958-zwnc7" Apr 16 17:53:53.729453 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:53.729305 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/33f61e9d-3dfb-429d-a797-917fb922e521-cert\") pod \"llmisvc-controller-manager-5644d5958-zwnc7\" (UID: \"33f61e9d-3dfb-429d-a797-917fb922e521\") " pod="kserve/llmisvc-controller-manager-5644d5958-zwnc7" Apr 16 17:53:53.731692 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:53.731670 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/33f61e9d-3dfb-429d-a797-917fb922e521-cert\") pod \"llmisvc-controller-manager-5644d5958-zwnc7\" (UID: \"33f61e9d-3dfb-429d-a797-917fb922e521\") " pod="kserve/llmisvc-controller-manager-5644d5958-zwnc7" Apr 16 17:53:53.740924 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:53.740856 2576 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-pwxjx\" (UniqueName: \"kubernetes.io/projected/33f61e9d-3dfb-429d-a797-917fb922e521-kube-api-access-pwxjx\") pod \"llmisvc-controller-manager-5644d5958-zwnc7\" (UID: \"33f61e9d-3dfb-429d-a797-917fb922e521\") " pod="kserve/llmisvc-controller-manager-5644d5958-zwnc7" Apr 16 17:53:53.803051 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:53.803011 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-5644d5958-zwnc7" Apr 16 17:53:53.927173 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:53.927146 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-5644d5958-zwnc7"] Apr 16 17:53:53.928599 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:53:53.928570 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod33f61e9d_3dfb_429d_a797_917fb922e521.slice/crio-9fa7ff118f597110fc4d8ea21b6aee5355a9a136d889a45169d3a5265bbeb079 WatchSource:0}: Error finding container 9fa7ff118f597110fc4d8ea21b6aee5355a9a136d889a45169d3a5265bbeb079: Status 404 returned error can't find the container with id 9fa7ff118f597110fc4d8ea21b6aee5355a9a136d889a45169d3a5265bbeb079 Apr 16 17:53:54.372843 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:54.372809 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-5644d5958-zwnc7" event={"ID":"33f61e9d-3dfb-429d-a797-917fb922e521","Type":"ContainerStarted","Data":"9fa7ff118f597110fc4d8ea21b6aee5355a9a136d889a45169d3a5265bbeb079"} Apr 16 17:53:57.387111 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:57.387067 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-5644d5958-zwnc7" event={"ID":"33f61e9d-3dfb-429d-a797-917fb922e521","Type":"ContainerStarted","Data":"2fd7db84c6ce15ace65802fc4dfc42b67e41fa46693cd2c71550ba873f5b63ac"} Apr 16 17:53:57.387577 ip-10-0-141-32 kubenswrapper[2576]: 
I0416 17:53:57.387153 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-5644d5958-zwnc7"
Apr 16 17:53:57.414211 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:53:57.414161 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-5644d5958-zwnc7" podStartSLOduration=1.61638712 podStartE2EDuration="4.414146381s" podCreationTimestamp="2026-04-16 17:53:53 +0000 UTC" firstStartedPulling="2026-04-16 17:53:53.929975895 +0000 UTC m=+831.946443394" lastFinishedPulling="2026-04-16 17:53:56.727735155 +0000 UTC m=+834.744202655" observedRunningTime="2026-04-16 17:53:57.412708654 +0000 UTC m=+835.429176186" watchObservedRunningTime="2026-04-16 17:53:57.414146381 +0000 UTC m=+835.430613902"
Apr 16 17:54:28.393496 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:54:28.393466 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-5644d5958-zwnc7"
Apr 16 17:55:02.539948 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:55:02.539856 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-6667474d89-gvbhc_1b5a2170-b541-4c6d-918a-c836b3286e61/cluster-monitoring-operator/0.log"
Apr 16 17:55:02.542554 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:55:02.542532 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-6667474d89-gvbhc_1b5a2170-b541-4c6d-918a-c836b3286e61/cluster-monitoring-operator/0.log"
Apr 16 17:55:40.503135 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:55:40.503096 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-98r99"]
Apr 16 17:55:40.507109 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:55:40.507081 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-98r99"
Apr 16 17:55:40.510003 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:55:40.509967 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-gateway-1-openshift-default-dockercfg-j52m5\""
Apr 16 17:55:40.510433 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:55:40.510412 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"istio-ca-root-cert\""
Apr 16 17:55:40.510558 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:55:40.510449 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\""
Apr 16 17:55:40.510673 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:55:40.510489 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 16 17:55:40.520423 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:55:40.520401 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-98r99"]
Apr 16 17:55:40.549466 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:55:40.549433 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/6b214e90-043e-4e8e-8231-cf881f4d0988-credential-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-98r99\" (UID: \"6b214e90-043e-4e8e-8231-cf881f4d0988\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-98r99"
Apr 16 17:55:40.549667 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:55:40.549473 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/6b214e90-043e-4e8e-8231-cf881f4d0988-istio-data\") pod \"router-gateway-1-openshift-default-6c59fbf55c-98r99\" (UID: \"6b214e90-043e-4e8e-8231-cf881f4d0988\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-98r99"
Apr 16 17:55:40.549667 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:55:40.549494 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/6b214e90-043e-4e8e-8231-cf881f4d0988-istio-token\") pod \"router-gateway-1-openshift-default-6c59fbf55c-98r99\" (UID: \"6b214e90-043e-4e8e-8231-cf881f4d0988\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-98r99"
Apr 16 17:55:40.549667 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:55:40.549552 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/6b214e90-043e-4e8e-8231-cf881f4d0988-workload-certs\") pod \"router-gateway-1-openshift-default-6c59fbf55c-98r99\" (UID: \"6b214e90-043e-4e8e-8231-cf881f4d0988\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-98r99"
Apr 16 17:55:40.549667 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:55:40.549589 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/6b214e90-043e-4e8e-8231-cf881f4d0988-istiod-ca-cert\") pod \"router-gateway-1-openshift-default-6c59fbf55c-98r99\" (UID: \"6b214e90-043e-4e8e-8231-cf881f4d0988\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-98r99"
Apr 16 17:55:40.549667 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:55:40.549630 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/6b214e90-043e-4e8e-8231-cf881f4d0988-istio-envoy\") pod \"router-gateway-1-openshift-default-6c59fbf55c-98r99\" (UID: \"6b214e90-043e-4e8e-8231-cf881f4d0988\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-98r99"
Apr 16 17:55:40.549950 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:55:40.549698 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/6b214e90-043e-4e8e-8231-cf881f4d0988-istio-podinfo\") pod \"router-gateway-1-openshift-default-6c59fbf55c-98r99\" (UID: \"6b214e90-043e-4e8e-8231-cf881f4d0988\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-98r99"
Apr 16 17:55:40.549950 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:55:40.549755 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fn85d\" (UniqueName: \"kubernetes.io/projected/6b214e90-043e-4e8e-8231-cf881f4d0988-kube-api-access-fn85d\") pod \"router-gateway-1-openshift-default-6c59fbf55c-98r99\" (UID: \"6b214e90-043e-4e8e-8231-cf881f4d0988\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-98r99"
Apr 16 17:55:40.549950 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:55:40.549795 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/6b214e90-043e-4e8e-8231-cf881f4d0988-workload-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-98r99\" (UID: \"6b214e90-043e-4e8e-8231-cf881f4d0988\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-98r99"
Apr 16 17:55:40.650977 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:55:40.650939 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/6b214e90-043e-4e8e-8231-cf881f4d0988-istio-podinfo\") pod \"router-gateway-1-openshift-default-6c59fbf55c-98r99\" (UID: \"6b214e90-043e-4e8e-8231-cf881f4d0988\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-98r99"
Apr 16 17:55:40.651197 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:55:40.651015 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fn85d\" (UniqueName: \"kubernetes.io/projected/6b214e90-043e-4e8e-8231-cf881f4d0988-kube-api-access-fn85d\") pod \"router-gateway-1-openshift-default-6c59fbf55c-98r99\" (UID: \"6b214e90-043e-4e8e-8231-cf881f4d0988\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-98r99"
Apr 16 17:55:40.651197 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:55:40.651055 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/6b214e90-043e-4e8e-8231-cf881f4d0988-workload-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-98r99\" (UID: \"6b214e90-043e-4e8e-8231-cf881f4d0988\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-98r99"
Apr 16 17:55:40.651197 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:55:40.651085 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/6b214e90-043e-4e8e-8231-cf881f4d0988-credential-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-98r99\" (UID: \"6b214e90-043e-4e8e-8231-cf881f4d0988\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-98r99"
Apr 16 17:55:40.651197 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:55:40.651185 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/6b214e90-043e-4e8e-8231-cf881f4d0988-istio-data\") pod \"router-gateway-1-openshift-default-6c59fbf55c-98r99\" (UID: \"6b214e90-043e-4e8e-8231-cf881f4d0988\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-98r99"
Apr 16 17:55:40.651594 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:55:40.651535 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/6b214e90-043e-4e8e-8231-cf881f4d0988-workload-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-98r99\" (UID: \"6b214e90-043e-4e8e-8231-cf881f4d0988\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-98r99"
Apr 16 17:55:40.651594 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:55:40.651560 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/6b214e90-043e-4e8e-8231-cf881f4d0988-credential-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-98r99\" (UID: \"6b214e90-043e-4e8e-8231-cf881f4d0988\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-98r99"
Apr 16 17:55:40.651791 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:55:40.651611 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/6b214e90-043e-4e8e-8231-cf881f4d0988-istio-token\") pod \"router-gateway-1-openshift-default-6c59fbf55c-98r99\" (UID: \"6b214e90-043e-4e8e-8231-cf881f4d0988\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-98r99"
Apr 16 17:55:40.651791 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:55:40.651688 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/6b214e90-043e-4e8e-8231-cf881f4d0988-workload-certs\") pod \"router-gateway-1-openshift-default-6c59fbf55c-98r99\" (UID: \"6b214e90-043e-4e8e-8231-cf881f4d0988\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-98r99"
Apr 16 17:55:40.651791 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:55:40.651716 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/6b214e90-043e-4e8e-8231-cf881f4d0988-istiod-ca-cert\") pod \"router-gateway-1-openshift-default-6c59fbf55c-98r99\" (UID: \"6b214e90-043e-4e8e-8231-cf881f4d0988\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-98r99"
Apr 16 17:55:40.651791 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:55:40.651753 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/6b214e90-043e-4e8e-8231-cf881f4d0988-istio-envoy\") pod \"router-gateway-1-openshift-default-6c59fbf55c-98r99\" (UID: \"6b214e90-043e-4e8e-8231-cf881f4d0988\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-98r99"
Apr 16 17:55:40.651791 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:55:40.651764 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/6b214e90-043e-4e8e-8231-cf881f4d0988-istio-data\") pod \"router-gateway-1-openshift-default-6c59fbf55c-98r99\" (UID: \"6b214e90-043e-4e8e-8231-cf881f4d0988\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-98r99"
Apr 16 17:55:40.652654 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:55:40.652628 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/6b214e90-043e-4e8e-8231-cf881f4d0988-istiod-ca-cert\") pod \"router-gateway-1-openshift-default-6c59fbf55c-98r99\" (UID: \"6b214e90-043e-4e8e-8231-cf881f4d0988\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-98r99"
Apr 16 17:55:40.652764 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:55:40.652741 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/6b214e90-043e-4e8e-8231-cf881f4d0988-workload-certs\") pod \"router-gateway-1-openshift-default-6c59fbf55c-98r99\" (UID: \"6b214e90-043e-4e8e-8231-cf881f4d0988\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-98r99"
Apr 16 17:55:40.654148 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:55:40.654129 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/6b214e90-043e-4e8e-8231-cf881f4d0988-istio-envoy\") pod \"router-gateway-1-openshift-default-6c59fbf55c-98r99\" (UID: \"6b214e90-043e-4e8e-8231-cf881f4d0988\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-98r99"
Apr 16 17:55:40.654385 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:55:40.654362 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/6b214e90-043e-4e8e-8231-cf881f4d0988-istio-podinfo\") pod \"router-gateway-1-openshift-default-6c59fbf55c-98r99\" (UID: \"6b214e90-043e-4e8e-8231-cf881f4d0988\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-98r99"
Apr 16 17:55:40.667456 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:55:40.667423 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/6b214e90-043e-4e8e-8231-cf881f4d0988-istio-token\") pod \"router-gateway-1-openshift-default-6c59fbf55c-98r99\" (UID: \"6b214e90-043e-4e8e-8231-cf881f4d0988\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-98r99"
Apr 16 17:55:40.667652 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:55:40.667629 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fn85d\" (UniqueName: \"kubernetes.io/projected/6b214e90-043e-4e8e-8231-cf881f4d0988-kube-api-access-fn85d\") pod \"router-gateway-1-openshift-default-6c59fbf55c-98r99\" (UID: \"6b214e90-043e-4e8e-8231-cf881f4d0988\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-98r99"
Apr 16 17:55:40.821325 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:55:40.821229 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-98r99"
Apr 16 17:55:40.980756 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:55:40.980722 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-98r99"]
Apr 16 17:55:40.982346 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:55:40.982322 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b214e90_043e_4e8e_8231_cf881f4d0988.slice/crio-d28b0f30d47316bc5311682c99799da54a9d05af939c097b8200b13bd92fbe9e WatchSource:0}: Error finding container d28b0f30d47316bc5311682c99799da54a9d05af939c097b8200b13bd92fbe9e: Status 404 returned error can't find the container with id d28b0f30d47316bc5311682c99799da54a9d05af939c097b8200b13bd92fbe9e
Apr 16 17:55:40.984551 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:55:40.984513 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"}
Apr 16 17:55:40.984665 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:55:40.984586 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"}
Apr 16 17:55:40.984665 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:55:40.984622 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"}
Apr 16 17:55:41.797491 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:55:41.797448 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-98r99" event={"ID":"6b214e90-043e-4e8e-8231-cf881f4d0988","Type":"ContainerStarted","Data":"5592d87e352cfdb4ba890db2eaadb82cf7442a3c980af4056d1a4334d1117890"}
Apr 16 17:55:41.797491 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:55:41.797494 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-98r99" event={"ID":"6b214e90-043e-4e8e-8231-cf881f4d0988","Type":"ContainerStarted","Data":"d28b0f30d47316bc5311682c99799da54a9d05af939c097b8200b13bd92fbe9e"}
Apr 16 17:55:41.820828 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:55:41.820755 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-98r99" podStartSLOduration=1.820739171 podStartE2EDuration="1.820739171s" podCreationTimestamp="2026-04-16 17:55:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 17:55:41.81764666 +0000 UTC m=+939.834114206" watchObservedRunningTime="2026-04-16 17:55:41.820739171 +0000 UTC m=+939.837206691"
Apr 16 17:55:41.821848 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:55:41.821822 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-98r99"
Apr 16 17:55:42.826588 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:55:42.826558 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-98r99"
Apr 16 17:55:43.807392 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:55:43.807360 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-98r99"
Apr 16 17:55:43.808349 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:55:43.808326 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-98r99"
Apr 16 17:55:58.708504 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:55:58.708462 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5fcb498446ddszc"]
Apr 16 17:55:58.711862 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:55:58.711844 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5fcb498446ddszc"
Apr 16 17:55:58.716297 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:55:58.716277 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-4mzq6\""
Apr 16 17:55:58.716417 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:55:58.716339 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvad71fa5348b85aebd404221bba611457-kserve-self-signed-certs\""
Apr 16 17:55:58.728695 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:55:58.727930 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5fcb498446ddszc"]
Apr 16 17:55:58.809469 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:55:58.809418 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/533ced57-d063-40d3-9f84-47da5c390798-dshm\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5fcb498446ddszc\" (UID: \"533ced57-d063-40d3-9f84-47da5c390798\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5fcb498446ddszc"
Apr 16 17:55:58.809667 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:55:58.809494 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/533ced57-d063-40d3-9f84-47da5c390798-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5fcb498446ddszc\" (UID: \"533ced57-d063-40d3-9f84-47da5c390798\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5fcb498446ddszc"
Apr 16 17:55:58.809667 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:55:58.809542 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/533ced57-d063-40d3-9f84-47da5c390798-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5fcb498446ddszc\" (UID: \"533ced57-d063-40d3-9f84-47da5c390798\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5fcb498446ddszc"
Apr 16 17:55:58.809797 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:55:58.809673 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/533ced57-d063-40d3-9f84-47da5c390798-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5fcb498446ddszc\" (UID: \"533ced57-d063-40d3-9f84-47da5c390798\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5fcb498446ddszc"
Apr 16 17:55:58.809797 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:55:58.809734 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/533ced57-d063-40d3-9f84-47da5c390798-home\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5fcb498446ddszc\" (UID: \"533ced57-d063-40d3-9f84-47da5c390798\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5fcb498446ddszc"
Apr 16 17:55:58.809797 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:55:58.809763 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwkpm\" (UniqueName: \"kubernetes.io/projected/533ced57-d063-40d3-9f84-47da5c390798-kube-api-access-xwkpm\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5fcb498446ddszc\" (UID: \"533ced57-d063-40d3-9f84-47da5c390798\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5fcb498446ddszc"
Apr 16 17:55:58.910485 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:55:58.910446 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/533ced57-d063-40d3-9f84-47da5c390798-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5fcb498446ddszc\" (UID: \"533ced57-d063-40d3-9f84-47da5c390798\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5fcb498446ddszc"
Apr 16 17:55:58.910739 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:55:58.910498 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/533ced57-d063-40d3-9f84-47da5c390798-home\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5fcb498446ddszc\" (UID: \"533ced57-d063-40d3-9f84-47da5c390798\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5fcb498446ddszc"
Apr 16 17:55:58.910739 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:55:58.910525 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xwkpm\" (UniqueName: \"kubernetes.io/projected/533ced57-d063-40d3-9f84-47da5c390798-kube-api-access-xwkpm\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5fcb498446ddszc\" (UID: \"533ced57-d063-40d3-9f84-47da5c390798\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5fcb498446ddszc"
Apr 16 17:55:58.910739 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:55:58.910559 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/533ced57-d063-40d3-9f84-47da5c390798-dshm\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5fcb498446ddszc\" (UID: \"533ced57-d063-40d3-9f84-47da5c390798\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5fcb498446ddszc"
Apr 16 17:55:58.910739 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:55:58.910608 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/533ced57-d063-40d3-9f84-47da5c390798-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5fcb498446ddszc\" (UID: \"533ced57-d063-40d3-9f84-47da5c390798\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5fcb498446ddszc"
Apr 16 17:55:58.910739 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:55:58.910648 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/533ced57-d063-40d3-9f84-47da5c390798-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5fcb498446ddszc\" (UID: \"533ced57-d063-40d3-9f84-47da5c390798\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5fcb498446ddszc"
Apr 16 17:55:58.911005 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:55:58.910965 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/533ced57-d063-40d3-9f84-47da5c390798-home\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5fcb498446ddszc\" (UID: \"533ced57-d063-40d3-9f84-47da5c390798\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5fcb498446ddszc"
Apr 16 17:55:58.911005 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:55:58.910989 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/533ced57-d063-40d3-9f84-47da5c390798-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5fcb498446ddszc\" (UID: \"533ced57-d063-40d3-9f84-47da5c390798\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5fcb498446ddszc"
Apr 16 17:55:58.911164 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:55:58.911144 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/533ced57-d063-40d3-9f84-47da5c390798-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5fcb498446ddszc\" (UID: \"533ced57-d063-40d3-9f84-47da5c390798\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5fcb498446ddszc"
Apr 16 17:55:58.913077 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:55:58.913049 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/533ced57-d063-40d3-9f84-47da5c390798-dshm\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5fcb498446ddszc\" (UID: \"533ced57-d063-40d3-9f84-47da5c390798\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5fcb498446ddszc"
Apr 16 17:55:58.913231 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:55:58.913213 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/533ced57-d063-40d3-9f84-47da5c390798-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5fcb498446ddszc\" (UID: \"533ced57-d063-40d3-9f84-47da5c390798\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5fcb498446ddszc"
Apr 16 17:55:58.920354 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:55:58.920326 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwkpm\" (UniqueName: \"kubernetes.io/projected/533ced57-d063-40d3-9f84-47da5c390798-kube-api-access-xwkpm\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5fcb498446ddszc\" (UID: \"533ced57-d063-40d3-9f84-47da5c390798\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5fcb498446ddszc"
Apr 16 17:55:59.025736 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:55:59.025644 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5fcb498446ddszc"
Apr 16 17:55:59.161989 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:55:59.161958 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5fcb498446ddszc"]
Apr 16 17:55:59.163234 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:55:59.163211 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod533ced57_d063_40d3_9f84_47da5c390798.slice/crio-11e5f54652f0377458cb680a2ed700a46ead7b65c9b6dccb1a44ec0ccf700e48 WatchSource:0}: Error finding container 11e5f54652f0377458cb680a2ed700a46ead7b65c9b6dccb1a44ec0ccf700e48: Status 404 returned error can't find the container with id 11e5f54652f0377458cb680a2ed700a46ead7b65c9b6dccb1a44ec0ccf700e48
Apr 16 17:55:59.882409 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:55:59.882347 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5fcb498446ddszc" event={"ID":"533ced57-d063-40d3-9f84-47da5c390798","Type":"ContainerStarted","Data":"11e5f54652f0377458cb680a2ed700a46ead7b65c9b6dccb1a44ec0ccf700e48"}
Apr 16 17:56:03.903508 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:56:03.903469 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5fcb498446ddszc" event={"ID":"533ced57-d063-40d3-9f84-47da5c390798","Type":"ContainerStarted","Data":"797e12eb47bba5f586175fc43eff3439d906caa8d35f5ee2dad775aca4bbde6f"}
Apr 16 17:56:07.921071 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:56:07.921031 2576 generic.go:358] "Generic (PLEG): container finished" podID="533ced57-d063-40d3-9f84-47da5c390798" containerID="797e12eb47bba5f586175fc43eff3439d906caa8d35f5ee2dad775aca4bbde6f" exitCode=0
Apr 16 17:56:07.921415 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:56:07.921088 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5fcb498446ddszc" event={"ID":"533ced57-d063-40d3-9f84-47da5c390798","Type":"ContainerDied","Data":"797e12eb47bba5f586175fc43eff3439d906caa8d35f5ee2dad775aca4bbde6f"}
Apr 16 17:56:09.931349 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:56:09.931306 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5fcb498446ddszc" event={"ID":"533ced57-d063-40d3-9f84-47da5c390798","Type":"ContainerStarted","Data":"d6b46efacf55e805e4a43a824fb30df30af3a65b53884e4d972ecf072bd7bf2a"}
Apr 16 17:56:09.954112 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:56:09.954061 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5fcb498446ddszc" podStartSLOduration=1.9847683200000001 podStartE2EDuration="11.95404518s" podCreationTimestamp="2026-04-16 17:55:58 +0000 UTC" firstStartedPulling="2026-04-16 17:55:59.165066934 +0000 UTC m=+957.181534436" lastFinishedPulling="2026-04-16 17:56:09.13434379 +0000 UTC m=+967.150811296" observedRunningTime="2026-04-16 17:56:09.95340298 +0000 UTC m=+967.969870498" watchObservedRunningTime="2026-04-16 17:56:09.95404518 +0000 UTC m=+967.970512766"
Apr 16 17:56:19.026615 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:56:19.026577 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5fcb498446ddszc"
Apr 16 17:56:19.026615 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:56:19.026620 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5fcb498446ddszc"
Apr 16 17:56:19.039219 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:56:19.039196 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5fcb498446ddszc"
Apr 16 17:56:19.982121 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:56:19.982093 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5fcb498446ddszc"
Apr 16 17:56:55.154581 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:56:55.154488 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d8svpjm"]
Apr 16 17:56:55.158409 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:56:55.158388 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d8svpjm"
Apr 16 17:56:55.161099 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:56:55.161074 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-epp-sa-dockercfg-vlt5k\""
Apr 16 17:56:55.161231 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:56:55.161189 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-kserve-self-signed-certs\""
Apr 16 17:56:55.171184 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:56:55.171154 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d8svpjm"]
Apr 16 17:56:55.208446 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:56:55.208408 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/6467e6a9-5007-4d46-9ca1-cec8a6b474df-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d8svpjm\" (UID: \"6467e6a9-5007-4d46-9ca1-cec8a6b474df\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d8svpjm"
Apr 16 17:56:55.208446 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:56:55.208447 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pfhg\" (UniqueName: \"kubernetes.io/projected/6467e6a9-5007-4d46-9ca1-cec8a6b474df-kube-api-access-8pfhg\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d8svpjm\" (UID: \"6467e6a9-5007-4d46-9ca1-cec8a6b474df\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d8svpjm"
Apr 16 17:56:55.208698 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:56:55.208579 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6467e6a9-5007-4d46-9ca1-cec8a6b474df-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d8svpjm\" (UID: \"6467e6a9-5007-4d46-9ca1-cec8a6b474df\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d8svpjm"
Apr 16 17:56:55.208698 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:56:55.208691 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/6467e6a9-5007-4d46-9ca1-cec8a6b474df-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d8svpjm\" (UID: \"6467e6a9-5007-4d46-9ca1-cec8a6b474df\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d8svpjm"
Apr 16 17:56:55.208783 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:56:55.208715 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/6467e6a9-5007-4d46-9ca1-cec8a6b474df-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d8svpjm\" (UID: \"6467e6a9-5007-4d46-9ca1-cec8a6b474df\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d8svpjm"
Apr 16 17:56:55.208783 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:56:55.208734 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6467e6a9-5007-4d46-9ca1-cec8a6b474df-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d8svpjm\" (UID: \"6467e6a9-5007-4d46-9ca1-cec8a6b474df\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d8svpjm"
Apr 16 17:56:55.309979 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:56:55.309933 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6467e6a9-5007-4d46-9ca1-cec8a6b474df-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d8svpjm\" (UID: \"6467e6a9-5007-4d46-9ca1-cec8a6b474df\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d8svpjm"
Apr 16 17:56:55.310168 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:56:55.310103 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/6467e6a9-5007-4d46-9ca1-cec8a6b474df-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d8svpjm\" (UID: \"6467e6a9-5007-4d46-9ca1-cec8a6b474df\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d8svpjm"
Apr 16 17:56:55.310168 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:56:55.310147 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/6467e6a9-5007-4d46-9ca1-cec8a6b474df-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d8svpjm\" (UID: \"6467e6a9-5007-4d46-9ca1-cec8a6b474df\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d8svpjm"
Apr 16 17:56:55.310269 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:56:55.310180 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6467e6a9-5007-4d46-9ca1-cec8a6b474df-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d8svpjm\" (UID: \"6467e6a9-5007-4d46-9ca1-cec8a6b474df\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d8svpjm"
Apr 16 17:56:55.310269 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:56:55.310216 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/6467e6a9-5007-4d46-9ca1-cec8a6b474df-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d8svpjm\" (UID: \"6467e6a9-5007-4d46-9ca1-cec8a6b474df\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d8svpjm"
Apr 16 17:56:55.310269 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:56:55.310259 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8pfhg\" (UniqueName: \"kubernetes.io/projected/6467e6a9-5007-4d46-9ca1-cec8a6b474df-kube-api-access-8pfhg\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d8svpjm\" (UID: \"6467e6a9-5007-4d46-9ca1-cec8a6b474df\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d8svpjm"
Apr 16 17:56:55.310445 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:56:55.310410 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume
\"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6467e6a9-5007-4d46-9ca1-cec8a6b474df-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d8svpjm\" (UID: \"6467e6a9-5007-4d46-9ca1-cec8a6b474df\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d8svpjm" Apr 16 17:56:55.310502 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:56:55.310460 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/6467e6a9-5007-4d46-9ca1-cec8a6b474df-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d8svpjm\" (UID: \"6467e6a9-5007-4d46-9ca1-cec8a6b474df\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d8svpjm" Apr 16 17:56:55.310618 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:56:55.310597 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/6467e6a9-5007-4d46-9ca1-cec8a6b474df-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d8svpjm\" (UID: \"6467e6a9-5007-4d46-9ca1-cec8a6b474df\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d8svpjm" Apr 16 17:56:55.310681 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:56:55.310611 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/6467e6a9-5007-4d46-9ca1-cec8a6b474df-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d8svpjm\" (UID: \"6467e6a9-5007-4d46-9ca1-cec8a6b474df\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d8svpjm" Apr 16 17:56:55.312765 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:56:55.312742 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/6467e6a9-5007-4d46-9ca1-cec8a6b474df-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d8svpjm\" (UID: \"6467e6a9-5007-4d46-9ca1-cec8a6b474df\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d8svpjm" Apr 16 17:56:55.320901 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:56:55.320871 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pfhg\" (UniqueName: \"kubernetes.io/projected/6467e6a9-5007-4d46-9ca1-cec8a6b474df-kube-api-access-8pfhg\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d8svpjm\" (UID: \"6467e6a9-5007-4d46-9ca1-cec8a6b474df\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d8svpjm" Apr 16 17:56:55.469344 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:56:55.469248 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d8svpjm" Apr 16 17:56:55.613028 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:56:55.612999 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d8svpjm"] Apr 16 17:56:55.615752 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:56:55.615716 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6467e6a9_5007_4d46_9ca1_cec8a6b474df.slice/crio-12b0faba289f64f36fc57bdf53814a0ce866338d57dc3551fa063c5ef5b7c302 WatchSource:0}: Error finding container 12b0faba289f64f36fc57bdf53814a0ce866338d57dc3551fa063c5ef5b7c302: Status 404 returned error can't find the container with id 12b0faba289f64f36fc57bdf53814a0ce866338d57dc3551fa063c5ef5b7c302 Apr 16 17:56:56.113591 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:56:56.113539 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d8svpjm" event={"ID":"6467e6a9-5007-4d46-9ca1-cec8a6b474df","Type":"ContainerStarted","Data":"0d0ce7f7986badf8990df4019dccb7d68bf89bb22a5a65fc3a2f8732334a78c8"} Apr 16 17:56:56.113591 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:56:56.113589 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d8svpjm" event={"ID":"6467e6a9-5007-4d46-9ca1-cec8a6b474df","Type":"ContainerStarted","Data":"12b0faba289f64f36fc57bdf53814a0ce866338d57dc3551fa063c5ef5b7c302"} Apr 16 17:56:57.118743 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:56:57.118705 2576 generic.go:358] "Generic (PLEG): container finished" podID="6467e6a9-5007-4d46-9ca1-cec8a6b474df" containerID="0d0ce7f7986badf8990df4019dccb7d68bf89bb22a5a65fc3a2f8732334a78c8" exitCode=0 Apr 16 17:56:57.119214 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:56:57.118786 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d8svpjm" event={"ID":"6467e6a9-5007-4d46-9ca1-cec8a6b474df","Type":"ContainerDied","Data":"0d0ce7f7986badf8990df4019dccb7d68bf89bb22a5a65fc3a2f8732334a78c8"} Apr 16 17:56:58.125032 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:56:58.125003 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d8svpjm" event={"ID":"6467e6a9-5007-4d46-9ca1-cec8a6b474df","Type":"ContainerStarted","Data":"df9079c9857f626c2673617f60d1ed35ade40603b7aa8aac3c286779b1cc96db"} Apr 16 17:57:27.709248 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:57:27.709200 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5fcb498446ddszc"] Apr 16 17:57:27.709788 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:57:27.709495 2576 kuberuntime_container.go:864] "Killing 
container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5fcb498446ddszc" podUID="533ced57-d063-40d3-9f84-47da5c390798" containerName="main" containerID="cri-o://d6b46efacf55e805e4a43a824fb30df30af3a65b53884e4d972ecf072bd7bf2a" gracePeriod=30 Apr 16 17:57:28.107490 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:57:28.107462 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5fcb498446ddszc" Apr 16 17:57:28.239617 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:57:28.239539 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/533ced57-d063-40d3-9f84-47da5c390798-dshm\") pod \"533ced57-d063-40d3-9f84-47da5c390798\" (UID: \"533ced57-d063-40d3-9f84-47da5c390798\") " Apr 16 17:57:28.239617 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:57:28.239585 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/533ced57-d063-40d3-9f84-47da5c390798-home\") pod \"533ced57-d063-40d3-9f84-47da5c390798\" (UID: \"533ced57-d063-40d3-9f84-47da5c390798\") " Apr 16 17:57:28.239835 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:57:28.239644 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/533ced57-d063-40d3-9f84-47da5c390798-kserve-provision-location\") pod \"533ced57-d063-40d3-9f84-47da5c390798\" (UID: \"533ced57-d063-40d3-9f84-47da5c390798\") " Apr 16 17:57:28.239835 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:57:28.239670 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwkpm\" (UniqueName: \"kubernetes.io/projected/533ced57-d063-40d3-9f84-47da5c390798-kube-api-access-xwkpm\") pod \"533ced57-d063-40d3-9f84-47da5c390798\" (UID: 
\"533ced57-d063-40d3-9f84-47da5c390798\") " Apr 16 17:57:28.239835 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:57:28.239705 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/533ced57-d063-40d3-9f84-47da5c390798-model-cache\") pod \"533ced57-d063-40d3-9f84-47da5c390798\" (UID: \"533ced57-d063-40d3-9f84-47da5c390798\") " Apr 16 17:57:28.239835 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:57:28.239734 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/533ced57-d063-40d3-9f84-47da5c390798-tls-certs\") pod \"533ced57-d063-40d3-9f84-47da5c390798\" (UID: \"533ced57-d063-40d3-9f84-47da5c390798\") " Apr 16 17:57:28.240081 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:57:28.239822 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/533ced57-d063-40d3-9f84-47da5c390798-home" (OuterVolumeSpecName: "home") pod "533ced57-d063-40d3-9f84-47da5c390798" (UID: "533ced57-d063-40d3-9f84-47da5c390798"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:57:28.240081 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:57:28.240009 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/533ced57-d063-40d3-9f84-47da5c390798-model-cache" (OuterVolumeSpecName: "model-cache") pod "533ced57-d063-40d3-9f84-47da5c390798" (UID: "533ced57-d063-40d3-9f84-47da5c390798"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:57:28.240081 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:57:28.240032 2576 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/533ced57-d063-40d3-9f84-47da5c390798-home\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\"" Apr 16 17:57:28.241899 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:57:28.241870 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/533ced57-d063-40d3-9f84-47da5c390798-dshm" (OuterVolumeSpecName: "dshm") pod "533ced57-d063-40d3-9f84-47da5c390798" (UID: "533ced57-d063-40d3-9f84-47da5c390798"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:57:28.242099 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:57:28.242014 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/533ced57-d063-40d3-9f84-47da5c390798-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "533ced57-d063-40d3-9f84-47da5c390798" (UID: "533ced57-d063-40d3-9f84-47da5c390798"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 17:57:28.242234 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:57:28.242201 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/533ced57-d063-40d3-9f84-47da5c390798-kube-api-access-xwkpm" (OuterVolumeSpecName: "kube-api-access-xwkpm") pod "533ced57-d063-40d3-9f84-47da5c390798" (UID: "533ced57-d063-40d3-9f84-47da5c390798"). InnerVolumeSpecName "kube-api-access-xwkpm". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 17:57:28.270552 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:57:28.270518 2576 generic.go:358] "Generic (PLEG): container finished" podID="533ced57-d063-40d3-9f84-47da5c390798" containerID="d6b46efacf55e805e4a43a824fb30df30af3a65b53884e4d972ecf072bd7bf2a" exitCode=0 Apr 16 17:57:28.270727 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:57:28.270596 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5fcb498446ddszc" Apr 16 17:57:28.270727 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:57:28.270602 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5fcb498446ddszc" event={"ID":"533ced57-d063-40d3-9f84-47da5c390798","Type":"ContainerDied","Data":"d6b46efacf55e805e4a43a824fb30df30af3a65b53884e4d972ecf072bd7bf2a"} Apr 16 17:57:28.270727 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:57:28.270644 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5fcb498446ddszc" event={"ID":"533ced57-d063-40d3-9f84-47da5c390798","Type":"ContainerDied","Data":"11e5f54652f0377458cb680a2ed700a46ead7b65c9b6dccb1a44ec0ccf700e48"} Apr 16 17:57:28.270727 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:57:28.270664 2576 scope.go:117] "RemoveContainer" containerID="d6b46efacf55e805e4a43a824fb30df30af3a65b53884e4d972ecf072bd7bf2a" Apr 16 17:57:28.294052 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:57:28.294031 2576 scope.go:117] "RemoveContainer" containerID="797e12eb47bba5f586175fc43eff3439d906caa8d35f5ee2dad775aca4bbde6f" Apr 16 17:57:28.296642 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:57:28.296603 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/533ced57-d063-40d3-9f84-47da5c390798-kserve-provision-location" (OuterVolumeSpecName: 
"kserve-provision-location") pod "533ced57-d063-40d3-9f84-47da5c390798" (UID: "533ced57-d063-40d3-9f84-47da5c390798"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:57:28.340823 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:57:28.340787 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/533ced57-d063-40d3-9f84-47da5c390798-kserve-provision-location\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\"" Apr 16 17:57:28.341001 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:57:28.340837 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xwkpm\" (UniqueName: \"kubernetes.io/projected/533ced57-d063-40d3-9f84-47da5c390798-kube-api-access-xwkpm\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\"" Apr 16 17:57:28.341001 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:57:28.340848 2576 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/533ced57-d063-40d3-9f84-47da5c390798-model-cache\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\"" Apr 16 17:57:28.341001 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:57:28.340858 2576 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/533ced57-d063-40d3-9f84-47da5c390798-tls-certs\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\"" Apr 16 17:57:28.341001 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:57:28.340866 2576 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/533ced57-d063-40d3-9f84-47da5c390798-dshm\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\"" Apr 16 17:57:28.397846 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:57:28.397826 2576 scope.go:117] "RemoveContainer" containerID="d6b46efacf55e805e4a43a824fb30df30af3a65b53884e4d972ecf072bd7bf2a" Apr 16 17:57:28.398190 
ip-10-0-141-32 kubenswrapper[2576]: E0416 17:57:28.398168 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6b46efacf55e805e4a43a824fb30df30af3a65b53884e4d972ecf072bd7bf2a\": container with ID starting with d6b46efacf55e805e4a43a824fb30df30af3a65b53884e4d972ecf072bd7bf2a not found: ID does not exist" containerID="d6b46efacf55e805e4a43a824fb30df30af3a65b53884e4d972ecf072bd7bf2a" Apr 16 17:57:28.398265 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:57:28.398205 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6b46efacf55e805e4a43a824fb30df30af3a65b53884e4d972ecf072bd7bf2a"} err="failed to get container status \"d6b46efacf55e805e4a43a824fb30df30af3a65b53884e4d972ecf072bd7bf2a\": rpc error: code = NotFound desc = could not find container \"d6b46efacf55e805e4a43a824fb30df30af3a65b53884e4d972ecf072bd7bf2a\": container with ID starting with d6b46efacf55e805e4a43a824fb30df30af3a65b53884e4d972ecf072bd7bf2a not found: ID does not exist" Apr 16 17:57:28.398265 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:57:28.398227 2576 scope.go:117] "RemoveContainer" containerID="797e12eb47bba5f586175fc43eff3439d906caa8d35f5ee2dad775aca4bbde6f" Apr 16 17:57:28.398540 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:57:28.398515 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"797e12eb47bba5f586175fc43eff3439d906caa8d35f5ee2dad775aca4bbde6f\": container with ID starting with 797e12eb47bba5f586175fc43eff3439d906caa8d35f5ee2dad775aca4bbde6f not found: ID does not exist" containerID="797e12eb47bba5f586175fc43eff3439d906caa8d35f5ee2dad775aca4bbde6f" Apr 16 17:57:28.398581 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:57:28.398551 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"797e12eb47bba5f586175fc43eff3439d906caa8d35f5ee2dad775aca4bbde6f"} 
err="failed to get container status \"797e12eb47bba5f586175fc43eff3439d906caa8d35f5ee2dad775aca4bbde6f\": rpc error: code = NotFound desc = could not find container \"797e12eb47bba5f586175fc43eff3439d906caa8d35f5ee2dad775aca4bbde6f\": container with ID starting with 797e12eb47bba5f586175fc43eff3439d906caa8d35f5ee2dad775aca4bbde6f not found: ID does not exist" Apr 16 17:57:28.609568 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:57:28.609536 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5fcb498446ddszc"] Apr 16 17:57:28.611772 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:57:28.611745 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5fcb498446ddszc"] Apr 16 17:57:28.627678 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:57:28.627646 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="533ced57-d063-40d3-9f84-47da5c390798" path="/var/lib/kubelet/pods/533ced57-d063-40d3-9f84-47da5c390798/volumes" Apr 16 17:57:29.277747 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:57:29.277708 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d8svpjm" event={"ID":"6467e6a9-5007-4d46-9ca1-cec8a6b474df","Type":"ContainerStarted","Data":"a97bd611f3455c10e18b1a98d269f3c7d56d3be5e86255c575e7a40645cbb219"} Apr 16 17:57:29.278217 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:57:29.277953 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d8svpjm" Apr 16 17:57:29.280664 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:57:29.280636 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d8svpjm" podUID="6467e6a9-5007-4d46-9ca1-cec8a6b474df" 
containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 16 17:57:29.321651 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:57:29.321601 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d8svpjm" podStartSLOduration=3.016349201 podStartE2EDuration="34.321585294s" podCreationTimestamp="2026-04-16 17:56:55 +0000 UTC" firstStartedPulling="2026-04-16 17:56:57.119969157 +0000 UTC m=+1015.136436661" lastFinishedPulling="2026-04-16 17:57:28.425205252 +0000 UTC m=+1046.441672754" observedRunningTime="2026-04-16 17:57:29.320345641 +0000 UTC m=+1047.336813162" watchObservedRunningTime="2026-04-16 17:57:29.321585294 +0000 UTC m=+1047.338052826" Apr 16 17:57:30.285597 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:57:30.285563 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d8svpjm" podUID="6467e6a9-5007-4d46-9ca1-cec8a6b474df" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 16 17:57:35.469974 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:57:35.469928 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d8svpjm" Apr 16 17:57:35.469974 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:57:35.469981 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d8svpjm" Apr 16 17:57:35.470507 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:57:35.470263 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d8svpjm" podUID="6467e6a9-5007-4d46-9ca1-cec8a6b474df" containerName="tokenizer" probeResult="failure" output="Get 
\"http://10.133.0.46:8082/healthz\": dial tcp 10.133.0.46:8082: connect: connection refused" Apr 16 17:57:35.471524 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:57:35.471499 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d8svpjm" podUID="6467e6a9-5007-4d46-9ca1-cec8a6b474df" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 16 17:57:45.471424 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:57:45.471388 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d8svpjm" podUID="6467e6a9-5007-4d46-9ca1-cec8a6b474df" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 16 17:57:45.471807 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:57:45.471697 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d8svpjm" Apr 16 17:57:45.473142 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:57:45.473095 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d8svpjm" podUID="6467e6a9-5007-4d46-9ca1-cec8a6b474df" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 16 17:57:45.473365 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:57:45.473158 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d8svpjm" Apr 16 17:57:46.348650 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:57:46.348611 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d8svpjm" podUID="6467e6a9-5007-4d46-9ca1-cec8a6b474df" 
containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 16 17:57:56.348956 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:57:56.348835 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d8svpjm" podUID="6467e6a9-5007-4d46-9ca1-cec8a6b474df" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 16 17:57:56.467818 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:57:56.467780 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d8svpjm"] Apr 16 17:57:56.468151 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:57:56.468122 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d8svpjm" podUID="6467e6a9-5007-4d46-9ca1-cec8a6b474df" containerName="main" containerID="cri-o://df9079c9857f626c2673617f60d1ed35ade40603b7aa8aac3c286779b1cc96db" gracePeriod=30 Apr 16 17:57:56.468217 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:57:56.468166 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d8svpjm" podUID="6467e6a9-5007-4d46-9ca1-cec8a6b474df" containerName="tokenizer" containerID="cri-o://a97bd611f3455c10e18b1a98d269f3c7d56d3be5e86255c575e7a40645cbb219" gracePeriod=30 Apr 16 17:57:56.469531 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:57:56.469497 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d8svpjm" podUID="6467e6a9-5007-4d46-9ca1-cec8a6b474df" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 16 17:57:57.392085 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:57:57.392006 2576 
generic.go:358] "Generic (PLEG): container finished" podID="6467e6a9-5007-4d46-9ca1-cec8a6b474df" containerID="df9079c9857f626c2673617f60d1ed35ade40603b7aa8aac3c286779b1cc96db" exitCode=0 Apr 16 17:57:57.392085 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:57:57.392074 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d8svpjm" event={"ID":"6467e6a9-5007-4d46-9ca1-cec8a6b474df","Type":"ContainerDied","Data":"df9079c9857f626c2673617f60d1ed35ade40603b7aa8aac3c286779b1cc96db"} Apr 16 17:57:57.844800 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:57:57.844776 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d8svpjm" Apr 16 17:57:58.012820 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:57:58.012720 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6467e6a9-5007-4d46-9ca1-cec8a6b474df-tls-certs\") pod \"6467e6a9-5007-4d46-9ca1-cec8a6b474df\" (UID: \"6467e6a9-5007-4d46-9ca1-cec8a6b474df\") " Apr 16 17:57:58.012820 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:57:58.012788 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8pfhg\" (UniqueName: \"kubernetes.io/projected/6467e6a9-5007-4d46-9ca1-cec8a6b474df-kube-api-access-8pfhg\") pod \"6467e6a9-5007-4d46-9ca1-cec8a6b474df\" (UID: \"6467e6a9-5007-4d46-9ca1-cec8a6b474df\") " Apr 16 17:57:58.012820 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:57:58.012813 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6467e6a9-5007-4d46-9ca1-cec8a6b474df-kserve-provision-location\") pod \"6467e6a9-5007-4d46-9ca1-cec8a6b474df\" (UID: \"6467e6a9-5007-4d46-9ca1-cec8a6b474df\") " Apr 16 17:57:58.013177 ip-10-0-141-32 
kubenswrapper[2576]: I0416 17:57:58.012831 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/6467e6a9-5007-4d46-9ca1-cec8a6b474df-tokenizer-uds\") pod \"6467e6a9-5007-4d46-9ca1-cec8a6b474df\" (UID: \"6467e6a9-5007-4d46-9ca1-cec8a6b474df\") " Apr 16 17:57:58.013177 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:57:58.012847 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/6467e6a9-5007-4d46-9ca1-cec8a6b474df-tokenizer-tmp\") pod \"6467e6a9-5007-4d46-9ca1-cec8a6b474df\" (UID: \"6467e6a9-5007-4d46-9ca1-cec8a6b474df\") " Apr 16 17:57:58.013177 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:57:58.012944 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/6467e6a9-5007-4d46-9ca1-cec8a6b474df-tokenizer-cache\") pod \"6467e6a9-5007-4d46-9ca1-cec8a6b474df\" (UID: \"6467e6a9-5007-4d46-9ca1-cec8a6b474df\") " Apr 16 17:57:58.013177 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:57:58.013148 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6467e6a9-5007-4d46-9ca1-cec8a6b474df-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "6467e6a9-5007-4d46-9ca1-cec8a6b474df" (UID: "6467e6a9-5007-4d46-9ca1-cec8a6b474df"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:57:58.013419 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:57:58.013268 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6467e6a9-5007-4d46-9ca1-cec8a6b474df-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "6467e6a9-5007-4d46-9ca1-cec8a6b474df" (UID: "6467e6a9-5007-4d46-9ca1-cec8a6b474df"). InnerVolumeSpecName "tokenizer-tmp". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:57:58.013419 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:57:58.013281 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6467e6a9-5007-4d46-9ca1-cec8a6b474df-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "6467e6a9-5007-4d46-9ca1-cec8a6b474df" (UID: "6467e6a9-5007-4d46-9ca1-cec8a6b474df"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:57:58.013703 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:57:58.013680 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6467e6a9-5007-4d46-9ca1-cec8a6b474df-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "6467e6a9-5007-4d46-9ca1-cec8a6b474df" (UID: "6467e6a9-5007-4d46-9ca1-cec8a6b474df"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:57:58.015013 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:57:58.014983 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6467e6a9-5007-4d46-9ca1-cec8a6b474df-kube-api-access-8pfhg" (OuterVolumeSpecName: "kube-api-access-8pfhg") pod "6467e6a9-5007-4d46-9ca1-cec8a6b474df" (UID: "6467e6a9-5007-4d46-9ca1-cec8a6b474df"). InnerVolumeSpecName "kube-api-access-8pfhg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 17:57:58.015113 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:57:58.015047 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6467e6a9-5007-4d46-9ca1-cec8a6b474df-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "6467e6a9-5007-4d46-9ca1-cec8a6b474df" (UID: "6467e6a9-5007-4d46-9ca1-cec8a6b474df"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 17:57:58.114105 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:57:58.114060 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/6467e6a9-5007-4d46-9ca1-cec8a6b474df-tokenizer-cache\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\"" Apr 16 17:57:58.114105 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:57:58.114094 2576 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6467e6a9-5007-4d46-9ca1-cec8a6b474df-tls-certs\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\"" Apr 16 17:57:58.114105 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:57:58.114106 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8pfhg\" (UniqueName: \"kubernetes.io/projected/6467e6a9-5007-4d46-9ca1-cec8a6b474df-kube-api-access-8pfhg\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\"" Apr 16 17:57:58.114336 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:57:58.114116 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6467e6a9-5007-4d46-9ca1-cec8a6b474df-kserve-provision-location\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\"" Apr 16 17:57:58.114336 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:57:58.114126 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/6467e6a9-5007-4d46-9ca1-cec8a6b474df-tokenizer-uds\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\"" Apr 16 17:57:58.114336 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:57:58.114135 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/6467e6a9-5007-4d46-9ca1-cec8a6b474df-tokenizer-tmp\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\"" Apr 16 17:57:58.397993 ip-10-0-141-32 kubenswrapper[2576]: I0416 
17:57:58.397962 2576 generic.go:358] "Generic (PLEG): container finished" podID="6467e6a9-5007-4d46-9ca1-cec8a6b474df" containerID="a97bd611f3455c10e18b1a98d269f3c7d56d3be5e86255c575e7a40645cbb219" exitCode=0 Apr 16 17:57:58.398419 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:57:58.398032 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d8svpjm" Apr 16 17:57:58.398419 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:57:58.398047 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d8svpjm" event={"ID":"6467e6a9-5007-4d46-9ca1-cec8a6b474df","Type":"ContainerDied","Data":"a97bd611f3455c10e18b1a98d269f3c7d56d3be5e86255c575e7a40645cbb219"} Apr 16 17:57:58.398419 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:57:58.398089 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d8svpjm" event={"ID":"6467e6a9-5007-4d46-9ca1-cec8a6b474df","Type":"ContainerDied","Data":"12b0faba289f64f36fc57bdf53814a0ce866338d57dc3551fa063c5ef5b7c302"} Apr 16 17:57:58.398419 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:57:58.398104 2576 scope.go:117] "RemoveContainer" containerID="a97bd611f3455c10e18b1a98d269f3c7d56d3be5e86255c575e7a40645cbb219" Apr 16 17:57:58.407443 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:57:58.407428 2576 scope.go:117] "RemoveContainer" containerID="df9079c9857f626c2673617f60d1ed35ade40603b7aa8aac3c286779b1cc96db" Apr 16 17:57:58.415423 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:57:58.415401 2576 scope.go:117] "RemoveContainer" containerID="0d0ce7f7986badf8990df4019dccb7d68bf89bb22a5a65fc3a2f8732334a78c8" Apr 16 17:57:58.422878 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:57:58.422859 2576 scope.go:117] "RemoveContainer" containerID="a97bd611f3455c10e18b1a98d269f3c7d56d3be5e86255c575e7a40645cbb219" 
Apr 16 17:57:58.423220 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:57:58.423203 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a97bd611f3455c10e18b1a98d269f3c7d56d3be5e86255c575e7a40645cbb219\": container with ID starting with a97bd611f3455c10e18b1a98d269f3c7d56d3be5e86255c575e7a40645cbb219 not found: ID does not exist" containerID="a97bd611f3455c10e18b1a98d269f3c7d56d3be5e86255c575e7a40645cbb219" Apr 16 17:57:58.423267 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:57:58.423230 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a97bd611f3455c10e18b1a98d269f3c7d56d3be5e86255c575e7a40645cbb219"} err="failed to get container status \"a97bd611f3455c10e18b1a98d269f3c7d56d3be5e86255c575e7a40645cbb219\": rpc error: code = NotFound desc = could not find container \"a97bd611f3455c10e18b1a98d269f3c7d56d3be5e86255c575e7a40645cbb219\": container with ID starting with a97bd611f3455c10e18b1a98d269f3c7d56d3be5e86255c575e7a40645cbb219 not found: ID does not exist" Apr 16 17:57:58.423267 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:57:58.423249 2576 scope.go:117] "RemoveContainer" containerID="df9079c9857f626c2673617f60d1ed35ade40603b7aa8aac3c286779b1cc96db" Apr 16 17:57:58.423528 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:57:58.423507 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df9079c9857f626c2673617f60d1ed35ade40603b7aa8aac3c286779b1cc96db\": container with ID starting with df9079c9857f626c2673617f60d1ed35ade40603b7aa8aac3c286779b1cc96db not found: ID does not exist" containerID="df9079c9857f626c2673617f60d1ed35ade40603b7aa8aac3c286779b1cc96db" Apr 16 17:57:58.423567 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:57:58.423538 2576 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"df9079c9857f626c2673617f60d1ed35ade40603b7aa8aac3c286779b1cc96db"} err="failed to get container status \"df9079c9857f626c2673617f60d1ed35ade40603b7aa8aac3c286779b1cc96db\": rpc error: code = NotFound desc = could not find container \"df9079c9857f626c2673617f60d1ed35ade40603b7aa8aac3c286779b1cc96db\": container with ID starting with df9079c9857f626c2673617f60d1ed35ade40603b7aa8aac3c286779b1cc96db not found: ID does not exist" Apr 16 17:57:58.423567 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:57:58.423555 2576 scope.go:117] "RemoveContainer" containerID="0d0ce7f7986badf8990df4019dccb7d68bf89bb22a5a65fc3a2f8732334a78c8" Apr 16 17:57:58.423789 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:57:58.423770 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d0ce7f7986badf8990df4019dccb7d68bf89bb22a5a65fc3a2f8732334a78c8\": container with ID starting with 0d0ce7f7986badf8990df4019dccb7d68bf89bb22a5a65fc3a2f8732334a78c8 not found: ID does not exist" containerID="0d0ce7f7986badf8990df4019dccb7d68bf89bb22a5a65fc3a2f8732334a78c8" Apr 16 17:57:58.423841 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:57:58.423794 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d0ce7f7986badf8990df4019dccb7d68bf89bb22a5a65fc3a2f8732334a78c8"} err="failed to get container status \"0d0ce7f7986badf8990df4019dccb7d68bf89bb22a5a65fc3a2f8732334a78c8\": rpc error: code = NotFound desc = could not find container \"0d0ce7f7986badf8990df4019dccb7d68bf89bb22a5a65fc3a2f8732334a78c8\": container with ID starting with 0d0ce7f7986badf8990df4019dccb7d68bf89bb22a5a65fc3a2f8732334a78c8 not found: ID does not exist" Apr 16 17:57:58.429386 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:57:58.429363 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d8svpjm"] Apr 16 
17:57:58.436669 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:57:58.436646 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d8svpjm"] Apr 16 17:57:58.627620 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:57:58.627586 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6467e6a9-5007-4d46-9ca1-cec8a6b474df" path="/var/lib/kubelet/pods/6467e6a9-5007-4d46-9ca1-cec8a6b474df/volumes" Apr 16 17:58:16.538776 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:58:16.538738 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6cc77f558b-tmr2x"] Apr 16 17:58:16.539212 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:58:16.539093 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="533ced57-d063-40d3-9f84-47da5c390798" containerName="storage-initializer" Apr 16 17:58:16.539212 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:58:16.539107 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="533ced57-d063-40d3-9f84-47da5c390798" containerName="storage-initializer" Apr 16 17:58:16.539212 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:58:16.539117 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6467e6a9-5007-4d46-9ca1-cec8a6b474df" containerName="storage-initializer" Apr 16 17:58:16.539212 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:58:16.539123 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="6467e6a9-5007-4d46-9ca1-cec8a6b474df" containerName="storage-initializer" Apr 16 17:58:16.539212 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:58:16.539140 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6467e6a9-5007-4d46-9ca1-cec8a6b474df" containerName="tokenizer" Apr 16 17:58:16.539212 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:58:16.539147 2576 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="6467e6a9-5007-4d46-9ca1-cec8a6b474df" containerName="tokenizer" Apr 16 17:58:16.539212 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:58:16.539156 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6467e6a9-5007-4d46-9ca1-cec8a6b474df" containerName="main" Apr 16 17:58:16.539212 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:58:16.539161 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="6467e6a9-5007-4d46-9ca1-cec8a6b474df" containerName="main" Apr 16 17:58:16.539212 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:58:16.539171 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="533ced57-d063-40d3-9f84-47da5c390798" containerName="main" Apr 16 17:58:16.539212 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:58:16.539186 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="533ced57-d063-40d3-9f84-47da5c390798" containerName="main" Apr 16 17:58:16.539550 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:58:16.539261 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="6467e6a9-5007-4d46-9ca1-cec8a6b474df" containerName="tokenizer" Apr 16 17:58:16.539550 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:58:16.539272 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="533ced57-d063-40d3-9f84-47da5c390798" containerName="main" Apr 16 17:58:16.539550 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:58:16.539279 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="6467e6a9-5007-4d46-9ca1-cec8a6b474df" containerName="main" Apr 16 17:58:16.552073 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:58:16.552046 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6cc77f558b-tmr2x" Apr 16 17:58:16.557239 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:58:16.557213 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-4mzq6\"" Apr 16 17:58:16.557239 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:58:16.557238 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"precise-prefix-cache-test-kserve-self-signed-certs\"" Apr 16 17:58:16.564557 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:58:16.564533 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6cc77f558b-tmr2x"] Apr 16 17:58:16.649455 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:58:16.649422 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjk2v\" (UniqueName: \"kubernetes.io/projected/9f2fdd97-7f04-485e-b8bf-50803627faff-kube-api-access-cjk2v\") pod \"precise-prefix-cache-test-kserve-6cc77f558b-tmr2x\" (UID: \"9f2fdd97-7f04-485e-b8bf-50803627faff\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6cc77f558b-tmr2x" Apr 16 17:58:16.649633 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:58:16.649474 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9f2fdd97-7f04-485e-b8bf-50803627faff-home\") pod \"precise-prefix-cache-test-kserve-6cc77f558b-tmr2x\" (UID: \"9f2fdd97-7f04-485e-b8bf-50803627faff\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6cc77f558b-tmr2x" Apr 16 17:58:16.649633 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:58:16.649503 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9f2fdd97-7f04-485e-b8bf-50803627faff-model-cache\") 
pod \"precise-prefix-cache-test-kserve-6cc77f558b-tmr2x\" (UID: \"9f2fdd97-7f04-485e-b8bf-50803627faff\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6cc77f558b-tmr2x" Apr 16 17:58:16.649633 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:58:16.649523 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9f2fdd97-7f04-485e-b8bf-50803627faff-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-6cc77f558b-tmr2x\" (UID: \"9f2fdd97-7f04-485e-b8bf-50803627faff\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6cc77f558b-tmr2x" Apr 16 17:58:16.649633 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:58:16.649584 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9f2fdd97-7f04-485e-b8bf-50803627faff-tls-certs\") pod \"precise-prefix-cache-test-kserve-6cc77f558b-tmr2x\" (UID: \"9f2fdd97-7f04-485e-b8bf-50803627faff\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6cc77f558b-tmr2x" Apr 16 17:58:16.649784 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:58:16.649634 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9f2fdd97-7f04-485e-b8bf-50803627faff-dshm\") pod \"precise-prefix-cache-test-kserve-6cc77f558b-tmr2x\" (UID: \"9f2fdd97-7f04-485e-b8bf-50803627faff\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6cc77f558b-tmr2x" Apr 16 17:58:16.750197 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:58:16.750158 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9f2fdd97-7f04-485e-b8bf-50803627faff-dshm\") pod \"precise-prefix-cache-test-kserve-6cc77f558b-tmr2x\" (UID: \"9f2fdd97-7f04-485e-b8bf-50803627faff\") " 
pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6cc77f558b-tmr2x" Apr 16 17:58:16.750401 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:58:16.750211 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cjk2v\" (UniqueName: \"kubernetes.io/projected/9f2fdd97-7f04-485e-b8bf-50803627faff-kube-api-access-cjk2v\") pod \"precise-prefix-cache-test-kserve-6cc77f558b-tmr2x\" (UID: \"9f2fdd97-7f04-485e-b8bf-50803627faff\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6cc77f558b-tmr2x" Apr 16 17:58:16.750401 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:58:16.750251 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9f2fdd97-7f04-485e-b8bf-50803627faff-home\") pod \"precise-prefix-cache-test-kserve-6cc77f558b-tmr2x\" (UID: \"9f2fdd97-7f04-485e-b8bf-50803627faff\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6cc77f558b-tmr2x" Apr 16 17:58:16.750401 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:58:16.750276 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9f2fdd97-7f04-485e-b8bf-50803627faff-model-cache\") pod \"precise-prefix-cache-test-kserve-6cc77f558b-tmr2x\" (UID: \"9f2fdd97-7f04-485e-b8bf-50803627faff\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6cc77f558b-tmr2x" Apr 16 17:58:16.750401 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:58:16.750300 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9f2fdd97-7f04-485e-b8bf-50803627faff-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-6cc77f558b-tmr2x\" (UID: \"9f2fdd97-7f04-485e-b8bf-50803627faff\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6cc77f558b-tmr2x" Apr 16 17:58:16.750401 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:58:16.750338 
2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9f2fdd97-7f04-485e-b8bf-50803627faff-tls-certs\") pod \"precise-prefix-cache-test-kserve-6cc77f558b-tmr2x\" (UID: \"9f2fdd97-7f04-485e-b8bf-50803627faff\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6cc77f558b-tmr2x" Apr 16 17:58:16.750852 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:58:16.750824 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9f2fdd97-7f04-485e-b8bf-50803627faff-model-cache\") pod \"precise-prefix-cache-test-kserve-6cc77f558b-tmr2x\" (UID: \"9f2fdd97-7f04-485e-b8bf-50803627faff\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6cc77f558b-tmr2x" Apr 16 17:58:16.751014 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:58:16.750856 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9f2fdd97-7f04-485e-b8bf-50803627faff-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-6cc77f558b-tmr2x\" (UID: \"9f2fdd97-7f04-485e-b8bf-50803627faff\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6cc77f558b-tmr2x" Apr 16 17:58:16.751103 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:58:16.751057 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9f2fdd97-7f04-485e-b8bf-50803627faff-home\") pod \"precise-prefix-cache-test-kserve-6cc77f558b-tmr2x\" (UID: \"9f2fdd97-7f04-485e-b8bf-50803627faff\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6cc77f558b-tmr2x" Apr 16 17:58:16.752538 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:58:16.752516 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9f2fdd97-7f04-485e-b8bf-50803627faff-dshm\") pod 
\"precise-prefix-cache-test-kserve-6cc77f558b-tmr2x\" (UID: \"9f2fdd97-7f04-485e-b8bf-50803627faff\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6cc77f558b-tmr2x" Apr 16 17:58:16.752782 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:58:16.752765 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9f2fdd97-7f04-485e-b8bf-50803627faff-tls-certs\") pod \"precise-prefix-cache-test-kserve-6cc77f558b-tmr2x\" (UID: \"9f2fdd97-7f04-485e-b8bf-50803627faff\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6cc77f558b-tmr2x" Apr 16 17:58:16.765762 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:58:16.765737 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjk2v\" (UniqueName: \"kubernetes.io/projected/9f2fdd97-7f04-485e-b8bf-50803627faff-kube-api-access-cjk2v\") pod \"precise-prefix-cache-test-kserve-6cc77f558b-tmr2x\" (UID: \"9f2fdd97-7f04-485e-b8bf-50803627faff\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6cc77f558b-tmr2x" Apr 16 17:58:16.863840 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:58:16.863754 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6cc77f558b-tmr2x" Apr 16 17:58:17.034553 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:58:17.034514 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6cc77f558b-tmr2x"] Apr 16 17:58:17.039947 ip-10-0-141-32 kubenswrapper[2576]: W0416 17:58:17.039894 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f2fdd97_7f04_485e_b8bf_50803627faff.slice/crio-71ffac791d0baa9d4d681d933964e66334827e839fa58158339f83a870bd46f3 WatchSource:0}: Error finding container 71ffac791d0baa9d4d681d933964e66334827e839fa58158339f83a870bd46f3: Status 404 returned error can't find the container with id 71ffac791d0baa9d4d681d933964e66334827e839fa58158339f83a870bd46f3 Apr 16 17:58:17.041869 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:58:17.041848 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 17:58:17.475214 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:58:17.475175 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6cc77f558b-tmr2x" event={"ID":"9f2fdd97-7f04-485e-b8bf-50803627faff","Type":"ContainerStarted","Data":"d49acff81efcd26a7622d8e2010189cf34c5a1cbbd0814ff7c622188964abe8d"} Apr 16 17:58:17.475214 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:58:17.475211 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6cc77f558b-tmr2x" event={"ID":"9f2fdd97-7f04-485e-b8bf-50803627faff","Type":"ContainerStarted","Data":"71ffac791d0baa9d4d681d933964e66334827e839fa58158339f83a870bd46f3"} Apr 16 17:58:21.505168 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:58:21.505075 2576 generic.go:358] "Generic (PLEG): container finished" podID="9f2fdd97-7f04-485e-b8bf-50803627faff" 
containerID="d49acff81efcd26a7622d8e2010189cf34c5a1cbbd0814ff7c622188964abe8d" exitCode=0 Apr 16 17:58:21.505168 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:58:21.505152 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6cc77f558b-tmr2x" event={"ID":"9f2fdd97-7f04-485e-b8bf-50803627faff","Type":"ContainerDied","Data":"d49acff81efcd26a7622d8e2010189cf34c5a1cbbd0814ff7c622188964abe8d"} Apr 16 17:58:22.510601 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:58:22.510567 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6cc77f558b-tmr2x" event={"ID":"9f2fdd97-7f04-485e-b8bf-50803627faff","Type":"ContainerStarted","Data":"d25638b537780710d60d352609a27c76d603caa59b986df20786e05710b9c62d"} Apr 16 17:58:22.546025 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:58:22.545977 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6cc77f558b-tmr2x" podStartSLOduration=6.545962705 podStartE2EDuration="6.545962705s" podCreationTimestamp="2026-04-16 17:58:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 17:58:22.545059597 +0000 UTC m=+1100.561527117" watchObservedRunningTime="2026-04-16 17:58:22.545962705 +0000 UTC m=+1100.562430265" Apr 16 17:58:26.864205 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:58:26.864150 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6cc77f558b-tmr2x" Apr 16 17:58:26.864717 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:58:26.864303 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6cc77f558b-tmr2x" Apr 16 17:58:26.877247 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:58:26.877216 2576 kubelet.go:2658] 
"SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6cc77f558b-tmr2x" Apr 16 17:58:27.540207 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:58:27.540176 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6cc77f558b-tmr2x" Apr 16 17:58:52.099165 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:58:52.099104 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6cc77f558b-tmr2x"] Apr 16 17:58:52.099711 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:58:52.099583 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6cc77f558b-tmr2x" podUID="9f2fdd97-7f04-485e-b8bf-50803627faff" containerName="main" containerID="cri-o://d25638b537780710d60d352609a27c76d603caa59b986df20786e05710b9c62d" gracePeriod=30 Apr 16 17:58:52.372478 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:58:52.372452 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6cc77f558b-tmr2x" Apr 16 17:58:52.460735 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:58:52.460706 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9f2fdd97-7f04-485e-b8bf-50803627faff-home\") pod \"9f2fdd97-7f04-485e-b8bf-50803627faff\" (UID: \"9f2fdd97-7f04-485e-b8bf-50803627faff\") " Apr 16 17:58:52.460918 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:58:52.460785 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9f2fdd97-7f04-485e-b8bf-50803627faff-tls-certs\") pod \"9f2fdd97-7f04-485e-b8bf-50803627faff\" (UID: \"9f2fdd97-7f04-485e-b8bf-50803627faff\") " Apr 16 17:58:52.460918 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:58:52.460825 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cjk2v\" (UniqueName: \"kubernetes.io/projected/9f2fdd97-7f04-485e-b8bf-50803627faff-kube-api-access-cjk2v\") pod \"9f2fdd97-7f04-485e-b8bf-50803627faff\" (UID: \"9f2fdd97-7f04-485e-b8bf-50803627faff\") " Apr 16 17:58:52.460918 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:58:52.460878 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9f2fdd97-7f04-485e-b8bf-50803627faff-dshm\") pod \"9f2fdd97-7f04-485e-b8bf-50803627faff\" (UID: \"9f2fdd97-7f04-485e-b8bf-50803627faff\") " Apr 16 17:58:52.461074 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:58:52.460940 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9f2fdd97-7f04-485e-b8bf-50803627faff-kserve-provision-location\") pod \"9f2fdd97-7f04-485e-b8bf-50803627faff\" (UID: \"9f2fdd97-7f04-485e-b8bf-50803627faff\") " Apr 16 17:58:52.461074 ip-10-0-141-32 
kubenswrapper[2576]: I0416 17:58:52.460968 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9f2fdd97-7f04-485e-b8bf-50803627faff-model-cache\") pod \"9f2fdd97-7f04-485e-b8bf-50803627faff\" (UID: \"9f2fdd97-7f04-485e-b8bf-50803627faff\") " Apr 16 17:58:52.461074 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:58:52.461042 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f2fdd97-7f04-485e-b8bf-50803627faff-home" (OuterVolumeSpecName: "home") pod "9f2fdd97-7f04-485e-b8bf-50803627faff" (UID: "9f2fdd97-7f04-485e-b8bf-50803627faff"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:58:52.461385 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:58:52.461231 2576 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9f2fdd97-7f04-485e-b8bf-50803627faff-home\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\"" Apr 16 17:58:52.461385 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:58:52.461246 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f2fdd97-7f04-485e-b8bf-50803627faff-model-cache" (OuterVolumeSpecName: "model-cache") pod "9f2fdd97-7f04-485e-b8bf-50803627faff" (UID: "9f2fdd97-7f04-485e-b8bf-50803627faff"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:58:52.463016 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:58:52.462983 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f2fdd97-7f04-485e-b8bf-50803627faff-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "9f2fdd97-7f04-485e-b8bf-50803627faff" (UID: "9f2fdd97-7f04-485e-b8bf-50803627faff"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 17:58:52.463430 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:58:52.463402 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f2fdd97-7f04-485e-b8bf-50803627faff-dshm" (OuterVolumeSpecName: "dshm") pod "9f2fdd97-7f04-485e-b8bf-50803627faff" (UID: "9f2fdd97-7f04-485e-b8bf-50803627faff"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:58:52.463543 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:58:52.463522 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f2fdd97-7f04-485e-b8bf-50803627faff-kube-api-access-cjk2v" (OuterVolumeSpecName: "kube-api-access-cjk2v") pod "9f2fdd97-7f04-485e-b8bf-50803627faff" (UID: "9f2fdd97-7f04-485e-b8bf-50803627faff"). InnerVolumeSpecName "kube-api-access-cjk2v". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 17:58:52.519561 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:58:52.519513 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f2fdd97-7f04-485e-b8bf-50803627faff-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "9f2fdd97-7f04-485e-b8bf-50803627faff" (UID: "9f2fdd97-7f04-485e-b8bf-50803627faff"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:58:52.562213 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:58:52.562176 2576 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9f2fdd97-7f04-485e-b8bf-50803627faff-tls-certs\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\"" Apr 16 17:58:52.562213 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:58:52.562206 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cjk2v\" (UniqueName: \"kubernetes.io/projected/9f2fdd97-7f04-485e-b8bf-50803627faff-kube-api-access-cjk2v\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\"" Apr 16 17:58:52.562213 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:58:52.562217 2576 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9f2fdd97-7f04-485e-b8bf-50803627faff-dshm\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\"" Apr 16 17:58:52.562479 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:58:52.562230 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9f2fdd97-7f04-485e-b8bf-50803627faff-kserve-provision-location\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\"" Apr 16 17:58:52.562479 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:58:52.562239 2576 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9f2fdd97-7f04-485e-b8bf-50803627faff-model-cache\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\"" Apr 16 17:58:52.629355 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:58:52.629268 2576 generic.go:358] "Generic (PLEG): container finished" podID="9f2fdd97-7f04-485e-b8bf-50803627faff" containerID="d25638b537780710d60d352609a27c76d603caa59b986df20786e05710b9c62d" exitCode=0 Apr 16 17:58:52.629355 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:58:52.629306 2576 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6cc77f558b-tmr2x" event={"ID":"9f2fdd97-7f04-485e-b8bf-50803627faff","Type":"ContainerDied","Data":"d25638b537780710d60d352609a27c76d603caa59b986df20786e05710b9c62d"} Apr 16 17:58:52.629355 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:58:52.629345 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6cc77f558b-tmr2x" event={"ID":"9f2fdd97-7f04-485e-b8bf-50803627faff","Type":"ContainerDied","Data":"71ffac791d0baa9d4d681d933964e66334827e839fa58158339f83a870bd46f3"} Apr 16 17:58:52.629667 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:58:52.629355 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6cc77f558b-tmr2x" Apr 16 17:58:52.629667 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:58:52.629363 2576 scope.go:117] "RemoveContainer" containerID="d25638b537780710d60d352609a27c76d603caa59b986df20786e05710b9c62d" Apr 16 17:58:52.641468 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:58:52.641444 2576 scope.go:117] "RemoveContainer" containerID="d49acff81efcd26a7622d8e2010189cf34c5a1cbbd0814ff7c622188964abe8d" Apr 16 17:58:52.656471 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:58:52.656446 2576 scope.go:117] "RemoveContainer" containerID="d25638b537780710d60d352609a27c76d603caa59b986df20786e05710b9c62d" Apr 16 17:58:52.656798 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:58:52.656774 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d25638b537780710d60d352609a27c76d603caa59b986df20786e05710b9c62d\": container with ID starting with d25638b537780710d60d352609a27c76d603caa59b986df20786e05710b9c62d not found: ID does not exist" containerID="d25638b537780710d60d352609a27c76d603caa59b986df20786e05710b9c62d" Apr 16 17:58:52.656887 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:58:52.656812 2576 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d25638b537780710d60d352609a27c76d603caa59b986df20786e05710b9c62d"} err="failed to get container status \"d25638b537780710d60d352609a27c76d603caa59b986df20786e05710b9c62d\": rpc error: code = NotFound desc = could not find container \"d25638b537780710d60d352609a27c76d603caa59b986df20786e05710b9c62d\": container with ID starting with d25638b537780710d60d352609a27c76d603caa59b986df20786e05710b9c62d not found: ID does not exist" Apr 16 17:58:52.656887 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:58:52.656841 2576 scope.go:117] "RemoveContainer" containerID="d49acff81efcd26a7622d8e2010189cf34c5a1cbbd0814ff7c622188964abe8d" Apr 16 17:58:52.657185 ip-10-0-141-32 kubenswrapper[2576]: E0416 17:58:52.657167 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d49acff81efcd26a7622d8e2010189cf34c5a1cbbd0814ff7c622188964abe8d\": container with ID starting with d49acff81efcd26a7622d8e2010189cf34c5a1cbbd0814ff7c622188964abe8d not found: ID does not exist" containerID="d49acff81efcd26a7622d8e2010189cf34c5a1cbbd0814ff7c622188964abe8d" Apr 16 17:58:52.657243 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:58:52.657194 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d49acff81efcd26a7622d8e2010189cf34c5a1cbbd0814ff7c622188964abe8d"} err="failed to get container status \"d49acff81efcd26a7622d8e2010189cf34c5a1cbbd0814ff7c622188964abe8d\": rpc error: code = NotFound desc = could not find container \"d49acff81efcd26a7622d8e2010189cf34c5a1cbbd0814ff7c622188964abe8d\": container with ID starting with d49acff81efcd26a7622d8e2010189cf34c5a1cbbd0814ff7c622188964abe8d not found: ID does not exist" Apr 16 17:58:52.688029 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:58:52.687984 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6cc77f558b-tmr2x"] Apr 16 17:58:52.691042 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:58:52.691018 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6cc77f558b-tmr2x"] Apr 16 17:58:54.628432 ip-10-0-141-32 kubenswrapper[2576]: I0416 17:58:54.628398 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f2fdd97-7f04-485e-b8bf-50803627faff" path="/var/lib/kubelet/pods/9f2fdd97-7f04-485e-b8bf-50803627faff/volumes" Apr 16 18:00:02.574926 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:00:02.574825 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-6667474d89-gvbhc_1b5a2170-b541-4c6d-918a-c836b3286e61/cluster-monitoring-operator/0.log" Apr 16 18:00:02.579439 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:00:02.579408 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-6667474d89-gvbhc_1b5a2170-b541-4c6d-918a-c836b3286e61/cluster-monitoring-operator/0.log" Apr 16 18:00:13.136901 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:00:13.136867 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6df4bdc72w2ps"] Apr 16 18:00:13.137365 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:00:13.137259 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9f2fdd97-7f04-485e-b8bf-50803627faff" containerName="main" Apr 16 18:00:13.137365 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:00:13.137271 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f2fdd97-7f04-485e-b8bf-50803627faff" containerName="main" Apr 16 18:00:13.137365 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:00:13.137281 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9f2fdd97-7f04-485e-b8bf-50803627faff" 
containerName="storage-initializer" Apr 16 18:00:13.137365 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:00:13.137287 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f2fdd97-7f04-485e-b8bf-50803627faff" containerName="storage-initializer" Apr 16 18:00:13.137365 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:00:13.137333 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="9f2fdd97-7f04-485e-b8bf-50803627faff" containerName="main" Apr 16 18:00:13.140561 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:00:13.140541 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6df4bdc72w2ps" Apr 16 18:00:13.144242 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:00:13.144223 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-test-kserve-self-signed-certs\"" Apr 16 18:00:13.144242 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:00:13.144237 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-4mzq6\"" Apr 16 18:00:13.144375 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:00:13.144224 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-test-epp-sa-dockercfg-2b5nc\"" Apr 16 18:00:13.153150 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:00:13.153084 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6df4bdc72w2ps"] Apr 16 18:00:13.160735 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:00:13.160701 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0e9f60d3-d817-4a9e-832c-84a73a0145dc-kserve-provision-location\") pod 
\"custom-route-timeout-test-kserve-router-scheduler-6df4bdc72w2ps\" (UID: \"0e9f60d3-d817-4a9e-832c-84a73a0145dc\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6df4bdc72w2ps" Apr 16 18:00:13.160881 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:00:13.160761 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0e9f60d3-d817-4a9e-832c-84a73a0145dc-tls-certs\") pod \"custom-route-timeout-test-kserve-router-scheduler-6df4bdc72w2ps\" (UID: \"0e9f60d3-d817-4a9e-832c-84a73a0145dc\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6df4bdc72w2ps" Apr 16 18:00:13.160881 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:00:13.160796 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/0e9f60d3-d817-4a9e-832c-84a73a0145dc-tokenizer-uds\") pod \"custom-route-timeout-test-kserve-router-scheduler-6df4bdc72w2ps\" (UID: \"0e9f60d3-d817-4a9e-832c-84a73a0145dc\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6df4bdc72w2ps" Apr 16 18:00:13.160881 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:00:13.160846 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/0e9f60d3-d817-4a9e-832c-84a73a0145dc-tokenizer-tmp\") pod \"custom-route-timeout-test-kserve-router-scheduler-6df4bdc72w2ps\" (UID: \"0e9f60d3-d817-4a9e-832c-84a73a0145dc\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6df4bdc72w2ps" Apr 16 18:00:13.161042 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:00:13.160899 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/0e9f60d3-d817-4a9e-832c-84a73a0145dc-tokenizer-cache\") pod \"custom-route-timeout-test-kserve-router-scheduler-6df4bdc72w2ps\" (UID: \"0e9f60d3-d817-4a9e-832c-84a73a0145dc\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6df4bdc72w2ps" Apr 16 18:00:13.161042 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:00:13.161024 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9wfn\" (UniqueName: \"kubernetes.io/projected/0e9f60d3-d817-4a9e-832c-84a73a0145dc-kube-api-access-j9wfn\") pod \"custom-route-timeout-test-kserve-router-scheduler-6df4bdc72w2ps\" (UID: \"0e9f60d3-d817-4a9e-832c-84a73a0145dc\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6df4bdc72w2ps" Apr 16 18:00:13.261985 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:00:13.261948 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0e9f60d3-d817-4a9e-832c-84a73a0145dc-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-router-scheduler-6df4bdc72w2ps\" (UID: \"0e9f60d3-d817-4a9e-832c-84a73a0145dc\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6df4bdc72w2ps" Apr 16 18:00:13.261985 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:00:13.261993 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0e9f60d3-d817-4a9e-832c-84a73a0145dc-tls-certs\") pod \"custom-route-timeout-test-kserve-router-scheduler-6df4bdc72w2ps\" (UID: \"0e9f60d3-d817-4a9e-832c-84a73a0145dc\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6df4bdc72w2ps" Apr 16 18:00:13.262247 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:00:13.262015 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: 
\"kubernetes.io/empty-dir/0e9f60d3-d817-4a9e-832c-84a73a0145dc-tokenizer-uds\") pod \"custom-route-timeout-test-kserve-router-scheduler-6df4bdc72w2ps\" (UID: \"0e9f60d3-d817-4a9e-832c-84a73a0145dc\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6df4bdc72w2ps" Apr 16 18:00:13.262247 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:00:13.262043 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/0e9f60d3-d817-4a9e-832c-84a73a0145dc-tokenizer-tmp\") pod \"custom-route-timeout-test-kserve-router-scheduler-6df4bdc72w2ps\" (UID: \"0e9f60d3-d817-4a9e-832c-84a73a0145dc\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6df4bdc72w2ps" Apr 16 18:00:13.262247 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:00:13.262072 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/0e9f60d3-d817-4a9e-832c-84a73a0145dc-tokenizer-cache\") pod \"custom-route-timeout-test-kserve-router-scheduler-6df4bdc72w2ps\" (UID: \"0e9f60d3-d817-4a9e-832c-84a73a0145dc\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6df4bdc72w2ps" Apr 16 18:00:13.262247 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:00:13.262110 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j9wfn\" (UniqueName: \"kubernetes.io/projected/0e9f60d3-d817-4a9e-832c-84a73a0145dc-kube-api-access-j9wfn\") pod \"custom-route-timeout-test-kserve-router-scheduler-6df4bdc72w2ps\" (UID: \"0e9f60d3-d817-4a9e-832c-84a73a0145dc\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6df4bdc72w2ps" Apr 16 18:00:13.262465 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:00:13.262374 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: 
\"kubernetes.io/empty-dir/0e9f60d3-d817-4a9e-832c-84a73a0145dc-tokenizer-uds\") pod \"custom-route-timeout-test-kserve-router-scheduler-6df4bdc72w2ps\" (UID: \"0e9f60d3-d817-4a9e-832c-84a73a0145dc\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6df4bdc72w2ps" Apr 16 18:00:13.262465 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:00:13.262426 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0e9f60d3-d817-4a9e-832c-84a73a0145dc-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-router-scheduler-6df4bdc72w2ps\" (UID: \"0e9f60d3-d817-4a9e-832c-84a73a0145dc\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6df4bdc72w2ps" Apr 16 18:00:13.262546 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:00:13.262474 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/0e9f60d3-d817-4a9e-832c-84a73a0145dc-tokenizer-tmp\") pod \"custom-route-timeout-test-kserve-router-scheduler-6df4bdc72w2ps\" (UID: \"0e9f60d3-d817-4a9e-832c-84a73a0145dc\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6df4bdc72w2ps" Apr 16 18:00:13.262546 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:00:13.262512 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/0e9f60d3-d817-4a9e-832c-84a73a0145dc-tokenizer-cache\") pod \"custom-route-timeout-test-kserve-router-scheduler-6df4bdc72w2ps\" (UID: \"0e9f60d3-d817-4a9e-832c-84a73a0145dc\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6df4bdc72w2ps" Apr 16 18:00:13.264471 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:00:13.264448 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/0e9f60d3-d817-4a9e-832c-84a73a0145dc-tls-certs\") pod \"custom-route-timeout-test-kserve-router-scheduler-6df4bdc72w2ps\" (UID: \"0e9f60d3-d817-4a9e-832c-84a73a0145dc\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6df4bdc72w2ps" Apr 16 18:00:13.272966 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:00:13.272936 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9wfn\" (UniqueName: \"kubernetes.io/projected/0e9f60d3-d817-4a9e-832c-84a73a0145dc-kube-api-access-j9wfn\") pod \"custom-route-timeout-test-kserve-router-scheduler-6df4bdc72w2ps\" (UID: \"0e9f60d3-d817-4a9e-832c-84a73a0145dc\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6df4bdc72w2ps" Apr 16 18:00:13.450233 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:00:13.450191 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6df4bdc72w2ps" Apr 16 18:00:13.600091 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:00:13.600061 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6df4bdc72w2ps"] Apr 16 18:00:13.601819 ip-10-0-141-32 kubenswrapper[2576]: W0416 18:00:13.601782 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e9f60d3_d817_4a9e_832c_84a73a0145dc.slice/crio-4b17bf6c140085d164efda35ab328c2fc1d18563641a44dc4a119b454829d5e6 WatchSource:0}: Error finding container 4b17bf6c140085d164efda35ab328c2fc1d18563641a44dc4a119b454829d5e6: Status 404 returned error can't find the container with id 4b17bf6c140085d164efda35ab328c2fc1d18563641a44dc4a119b454829d5e6 Apr 16 18:00:13.944946 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:00:13.944883 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6df4bdc72w2ps" event={"ID":"0e9f60d3-d817-4a9e-832c-84a73a0145dc","Type":"ContainerStarted","Data":"ab87f5c6cade058545685d1f06362017f68484368d329245db66d1c0c41e7a8c"} Apr 16 18:00:13.945156 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:00:13.944954 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6df4bdc72w2ps" event={"ID":"0e9f60d3-d817-4a9e-832c-84a73a0145dc","Type":"ContainerStarted","Data":"4b17bf6c140085d164efda35ab328c2fc1d18563641a44dc4a119b454829d5e6"} Apr 16 18:00:14.950315 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:00:14.950276 2576 generic.go:358] "Generic (PLEG): container finished" podID="0e9f60d3-d817-4a9e-832c-84a73a0145dc" containerID="ab87f5c6cade058545685d1f06362017f68484368d329245db66d1c0c41e7a8c" exitCode=0 Apr 16 18:00:14.950810 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:00:14.950375 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6df4bdc72w2ps" event={"ID":"0e9f60d3-d817-4a9e-832c-84a73a0145dc","Type":"ContainerDied","Data":"ab87f5c6cade058545685d1f06362017f68484368d329245db66d1c0c41e7a8c"} Apr 16 18:00:15.955950 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:00:15.955885 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6df4bdc72w2ps" event={"ID":"0e9f60d3-d817-4a9e-832c-84a73a0145dc","Type":"ContainerStarted","Data":"4e407c3988e3e54833c1634eee9526518c4c0487a11185b251147d0e8a35b23c"} Apr 16 18:00:15.955950 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:00:15.955953 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6df4bdc72w2ps" 
event={"ID":"0e9f60d3-d817-4a9e-832c-84a73a0145dc","Type":"ContainerStarted","Data":"bc7f73668b9bd36a16464f5665af75164a0f25a221880ff319ba1503aea2463c"} Apr 16 18:00:15.956365 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:00:15.956047 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6df4bdc72w2ps" Apr 16 18:00:15.986091 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:00:15.986040 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6df4bdc72w2ps" podStartSLOduration=2.986023156 podStartE2EDuration="2.986023156s" podCreationTimestamp="2026-04-16 18:00:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:00:15.981694466 +0000 UTC m=+1213.998162010" watchObservedRunningTime="2026-04-16 18:00:15.986023156 +0000 UTC m=+1214.002490677" Apr 16 18:00:23.451347 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:00:23.451305 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6df4bdc72w2ps" Apr 16 18:00:23.451347 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:00:23.451350 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6df4bdc72w2ps" Apr 16 18:00:23.454079 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:00:23.454054 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6df4bdc72w2ps" Apr 16 18:00:23.988365 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:00:23.988338 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6df4bdc72w2ps" Apr 16 18:00:54.991766 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:00:54.991732 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6df4bdc72w2ps" Apr 16 18:02:10.091089 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:02:10.091051 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6df4bdc72w2ps"] Apr 16 18:02:10.091537 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:02:10.091362 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6df4bdc72w2ps" podUID="0e9f60d3-d817-4a9e-832c-84a73a0145dc" containerName="main" containerID="cri-o://bc7f73668b9bd36a16464f5665af75164a0f25a221880ff319ba1503aea2463c" gracePeriod=30 Apr 16 18:02:10.091537 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:02:10.091406 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6df4bdc72w2ps" podUID="0e9f60d3-d817-4a9e-832c-84a73a0145dc" containerName="tokenizer" containerID="cri-o://4e407c3988e3e54833c1634eee9526518c4c0487a11185b251147d0e8a35b23c" gracePeriod=30 Apr 16 18:02:10.399145 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:02:10.399112 2576 generic.go:358] "Generic (PLEG): container finished" podID="0e9f60d3-d817-4a9e-832c-84a73a0145dc" containerID="bc7f73668b9bd36a16464f5665af75164a0f25a221880ff319ba1503aea2463c" exitCode=0 Apr 16 18:02:10.399340 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:02:10.399185 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6df4bdc72w2ps" 
event={"ID":"0e9f60d3-d817-4a9e-832c-84a73a0145dc","Type":"ContainerDied","Data":"bc7f73668b9bd36a16464f5665af75164a0f25a221880ff319ba1503aea2463c"} Apr 16 18:02:11.405283 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:02:11.405243 2576 generic.go:358] "Generic (PLEG): container finished" podID="0e9f60d3-d817-4a9e-832c-84a73a0145dc" containerID="4e407c3988e3e54833c1634eee9526518c4c0487a11185b251147d0e8a35b23c" exitCode=0 Apr 16 18:02:11.405665 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:02:11.405312 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6df4bdc72w2ps" event={"ID":"0e9f60d3-d817-4a9e-832c-84a73a0145dc","Type":"ContainerDied","Data":"4e407c3988e3e54833c1634eee9526518c4c0487a11185b251147d0e8a35b23c"} Apr 16 18:02:11.450108 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:02:11.450075 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6df4bdc72w2ps" Apr 16 18:02:11.540144 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:02:11.540056 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0e9f60d3-d817-4a9e-832c-84a73a0145dc-kserve-provision-location\") pod \"0e9f60d3-d817-4a9e-832c-84a73a0145dc\" (UID: \"0e9f60d3-d817-4a9e-832c-84a73a0145dc\") " Apr 16 18:02:11.540144 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:02:11.540098 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/0e9f60d3-d817-4a9e-832c-84a73a0145dc-tokenizer-uds\") pod \"0e9f60d3-d817-4a9e-832c-84a73a0145dc\" (UID: \"0e9f60d3-d817-4a9e-832c-84a73a0145dc\") " Apr 16 18:02:11.540384 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:02:11.540157 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" 
(UniqueName: \"kubernetes.io/empty-dir/0e9f60d3-d817-4a9e-832c-84a73a0145dc-tokenizer-tmp\") pod \"0e9f60d3-d817-4a9e-832c-84a73a0145dc\" (UID: \"0e9f60d3-d817-4a9e-832c-84a73a0145dc\") " Apr 16 18:02:11.540384 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:02:11.540175 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/0e9f60d3-d817-4a9e-832c-84a73a0145dc-tokenizer-cache\") pod \"0e9f60d3-d817-4a9e-832c-84a73a0145dc\" (UID: \"0e9f60d3-d817-4a9e-832c-84a73a0145dc\") " Apr 16 18:02:11.540384 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:02:11.540197 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0e9f60d3-d817-4a9e-832c-84a73a0145dc-tls-certs\") pod \"0e9f60d3-d817-4a9e-832c-84a73a0145dc\" (UID: \"0e9f60d3-d817-4a9e-832c-84a73a0145dc\") " Apr 16 18:02:11.540384 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:02:11.540232 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j9wfn\" (UniqueName: \"kubernetes.io/projected/0e9f60d3-d817-4a9e-832c-84a73a0145dc-kube-api-access-j9wfn\") pod \"0e9f60d3-d817-4a9e-832c-84a73a0145dc\" (UID: \"0e9f60d3-d817-4a9e-832c-84a73a0145dc\") " Apr 16 18:02:11.540602 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:02:11.540382 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e9f60d3-d817-4a9e-832c-84a73a0145dc-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "0e9f60d3-d817-4a9e-832c-84a73a0145dc" (UID: "0e9f60d3-d817-4a9e-832c-84a73a0145dc"). InnerVolumeSpecName "tokenizer-uds". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:02:11.540602 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:02:11.540478 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/0e9f60d3-d817-4a9e-832c-84a73a0145dc-tokenizer-uds\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\"" Apr 16 18:02:11.540602 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:02:11.540473 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e9f60d3-d817-4a9e-832c-84a73a0145dc-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "0e9f60d3-d817-4a9e-832c-84a73a0145dc" (UID: "0e9f60d3-d817-4a9e-832c-84a73a0145dc"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:02:11.540602 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:02:11.540571 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e9f60d3-d817-4a9e-832c-84a73a0145dc-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "0e9f60d3-d817-4a9e-832c-84a73a0145dc" (UID: "0e9f60d3-d817-4a9e-832c-84a73a0145dc"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:02:11.540826 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:02:11.540800 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e9f60d3-d817-4a9e-832c-84a73a0145dc-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "0e9f60d3-d817-4a9e-832c-84a73a0145dc" (UID: "0e9f60d3-d817-4a9e-832c-84a73a0145dc"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:02:11.542430 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:02:11.542409 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e9f60d3-d817-4a9e-832c-84a73a0145dc-kube-api-access-j9wfn" (OuterVolumeSpecName: "kube-api-access-j9wfn") pod "0e9f60d3-d817-4a9e-832c-84a73a0145dc" (UID: "0e9f60d3-d817-4a9e-832c-84a73a0145dc"). InnerVolumeSpecName "kube-api-access-j9wfn". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:02:11.542601 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:02:11.542411 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e9f60d3-d817-4a9e-832c-84a73a0145dc-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "0e9f60d3-d817-4a9e-832c-84a73a0145dc" (UID: "0e9f60d3-d817-4a9e-832c-84a73a0145dc"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:02:11.641407 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:02:11.641362 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/0e9f60d3-d817-4a9e-832c-84a73a0145dc-tokenizer-tmp\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\"" Apr 16 18:02:11.641407 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:02:11.641398 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/0e9f60d3-d817-4a9e-832c-84a73a0145dc-tokenizer-cache\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\"" Apr 16 18:02:11.641407 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:02:11.641412 2576 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0e9f60d3-d817-4a9e-832c-84a73a0145dc-tls-certs\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\"" Apr 16 18:02:11.641671 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:02:11.641425 2576 
reconciler_common.go:299] "Volume detached for volume \"kube-api-access-j9wfn\" (UniqueName: \"kubernetes.io/projected/0e9f60d3-d817-4a9e-832c-84a73a0145dc-kube-api-access-j9wfn\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\"" Apr 16 18:02:11.641671 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:02:11.641438 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0e9f60d3-d817-4a9e-832c-84a73a0145dc-kserve-provision-location\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\"" Apr 16 18:02:12.411394 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:02:12.411358 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6df4bdc72w2ps" event={"ID":"0e9f60d3-d817-4a9e-832c-84a73a0145dc","Type":"ContainerDied","Data":"4b17bf6c140085d164efda35ab328c2fc1d18563641a44dc4a119b454829d5e6"} Apr 16 18:02:12.411799 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:02:12.411406 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6df4bdc72w2ps" Apr 16 18:02:12.411799 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:02:12.411414 2576 scope.go:117] "RemoveContainer" containerID="4e407c3988e3e54833c1634eee9526518c4c0487a11185b251147d0e8a35b23c" Apr 16 18:02:12.420428 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:02:12.420408 2576 scope.go:117] "RemoveContainer" containerID="bc7f73668b9bd36a16464f5665af75164a0f25a221880ff319ba1503aea2463c" Apr 16 18:02:12.428160 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:02:12.428139 2576 scope.go:117] "RemoveContainer" containerID="ab87f5c6cade058545685d1f06362017f68484368d329245db66d1c0c41e7a8c" Apr 16 18:02:12.438697 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:02:12.438670 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6df4bdc72w2ps"] Apr 16 18:02:12.445436 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:02:12.445413 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6df4bdc72w2ps"] Apr 16 18:02:12.627093 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:02:12.627061 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e9f60d3-d817-4a9e-832c-84a73a0145dc" path="/var/lib/kubelet/pods/0e9f60d3-d817-4a9e-832c-84a73a0145dc/volumes" Apr 16 18:02:30.504145 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:02:30.504062 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5d66576fd5-rdjx7"] Apr 16 18:02:30.504514 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:02:30.504431 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0e9f60d3-d817-4a9e-832c-84a73a0145dc" containerName="tokenizer" Apr 16 18:02:30.504514 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:02:30.504443 2576 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="0e9f60d3-d817-4a9e-832c-84a73a0145dc" containerName="tokenizer" Apr 16 18:02:30.504514 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:02:30.504455 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0e9f60d3-d817-4a9e-832c-84a73a0145dc" containerName="main" Apr 16 18:02:30.504514 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:02:30.504460 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e9f60d3-d817-4a9e-832c-84a73a0145dc" containerName="main" Apr 16 18:02:30.504514 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:02:30.504468 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0e9f60d3-d817-4a9e-832c-84a73a0145dc" containerName="storage-initializer" Apr 16 18:02:30.504514 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:02:30.504474 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e9f60d3-d817-4a9e-832c-84a73a0145dc" containerName="storage-initializer" Apr 16 18:02:30.504706 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:02:30.504557 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="0e9f60d3-d817-4a9e-832c-84a73a0145dc" containerName="tokenizer" Apr 16 18:02:30.504706 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:02:30.504567 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="0e9f60d3-d817-4a9e-832c-84a73a0145dc" containerName="main" Apr 16 18:02:30.508091 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:02:30.508067 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5d66576fd5-rdjx7" Apr 16 18:02:30.511994 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:02:30.511971 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-test-epp-sa-dockercfg-ghdqn\"" Apr 16 18:02:30.512121 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:02:30.511972 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-4mzq6\"" Apr 16 18:02:30.512121 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:02:30.511977 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-test-kserve-self-signed-certs\"" Apr 16 18:02:30.521653 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:02:30.521621 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5d66576fd5-rdjx7"] Apr 16 18:02:30.614272 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:02:30.614235 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/92e5c48f-d6f3-4ba5-be05-0ef67e410d2d-kserve-provision-location\") pod \"router-with-refs-test-kserve-router-scheduler-5d66576fd5-rdjx7\" (UID: \"92e5c48f-d6f3-4ba5-be05-0ef67e410d2d\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5d66576fd5-rdjx7" Apr 16 18:02:30.614272 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:02:30.614280 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/92e5c48f-d6f3-4ba5-be05-0ef67e410d2d-tokenizer-uds\") pod \"router-with-refs-test-kserve-router-scheduler-5d66576fd5-rdjx7\" (UID: \"92e5c48f-d6f3-4ba5-be05-0ef67e410d2d\") " 
pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5d66576fd5-rdjx7" Apr 16 18:02:30.614515 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:02:30.614306 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjbsj\" (UniqueName: \"kubernetes.io/projected/92e5c48f-d6f3-4ba5-be05-0ef67e410d2d-kube-api-access-bjbsj\") pod \"router-with-refs-test-kserve-router-scheduler-5d66576fd5-rdjx7\" (UID: \"92e5c48f-d6f3-4ba5-be05-0ef67e410d2d\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5d66576fd5-rdjx7" Apr 16 18:02:30.614515 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:02:30.614372 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/92e5c48f-d6f3-4ba5-be05-0ef67e410d2d-tokenizer-cache\") pod \"router-with-refs-test-kserve-router-scheduler-5d66576fd5-rdjx7\" (UID: \"92e5c48f-d6f3-4ba5-be05-0ef67e410d2d\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5d66576fd5-rdjx7" Apr 16 18:02:30.614515 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:02:30.614389 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/92e5c48f-d6f3-4ba5-be05-0ef67e410d2d-tokenizer-tmp\") pod \"router-with-refs-test-kserve-router-scheduler-5d66576fd5-rdjx7\" (UID: \"92e5c48f-d6f3-4ba5-be05-0ef67e410d2d\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5d66576fd5-rdjx7" Apr 16 18:02:30.614515 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:02:30.614480 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/92e5c48f-d6f3-4ba5-be05-0ef67e410d2d-tls-certs\") pod \"router-with-refs-test-kserve-router-scheduler-5d66576fd5-rdjx7\" (UID: 
\"92e5c48f-d6f3-4ba5-be05-0ef67e410d2d\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5d66576fd5-rdjx7" Apr 16 18:02:30.715064 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:02:30.715026 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/92e5c48f-d6f3-4ba5-be05-0ef67e410d2d-tokenizer-uds\") pod \"router-with-refs-test-kserve-router-scheduler-5d66576fd5-rdjx7\" (UID: \"92e5c48f-d6f3-4ba5-be05-0ef67e410d2d\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5d66576fd5-rdjx7" Apr 16 18:02:30.715064 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:02:30.715062 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bjbsj\" (UniqueName: \"kubernetes.io/projected/92e5c48f-d6f3-4ba5-be05-0ef67e410d2d-kube-api-access-bjbsj\") pod \"router-with-refs-test-kserve-router-scheduler-5d66576fd5-rdjx7\" (UID: \"92e5c48f-d6f3-4ba5-be05-0ef67e410d2d\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5d66576fd5-rdjx7" Apr 16 18:02:30.715285 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:02:30.715114 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/92e5c48f-d6f3-4ba5-be05-0ef67e410d2d-tokenizer-cache\") pod \"router-with-refs-test-kserve-router-scheduler-5d66576fd5-rdjx7\" (UID: \"92e5c48f-d6f3-4ba5-be05-0ef67e410d2d\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5d66576fd5-rdjx7" Apr 16 18:02:30.715285 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:02:30.715133 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/92e5c48f-d6f3-4ba5-be05-0ef67e410d2d-tokenizer-tmp\") pod \"router-with-refs-test-kserve-router-scheduler-5d66576fd5-rdjx7\" (UID: \"92e5c48f-d6f3-4ba5-be05-0ef67e410d2d\") " 
pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5d66576fd5-rdjx7" Apr 16 18:02:30.715357 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:02:30.715289 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/92e5c48f-d6f3-4ba5-be05-0ef67e410d2d-tls-certs\") pod \"router-with-refs-test-kserve-router-scheduler-5d66576fd5-rdjx7\" (UID: \"92e5c48f-d6f3-4ba5-be05-0ef67e410d2d\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5d66576fd5-rdjx7" Apr 16 18:02:30.715401 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:02:30.715384 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/92e5c48f-d6f3-4ba5-be05-0ef67e410d2d-kserve-provision-location\") pod \"router-with-refs-test-kserve-router-scheduler-5d66576fd5-rdjx7\" (UID: \"92e5c48f-d6f3-4ba5-be05-0ef67e410d2d\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5d66576fd5-rdjx7" Apr 16 18:02:30.715453 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:02:30.715435 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/92e5c48f-d6f3-4ba5-be05-0ef67e410d2d-tokenizer-uds\") pod \"router-with-refs-test-kserve-router-scheduler-5d66576fd5-rdjx7\" (UID: \"92e5c48f-d6f3-4ba5-be05-0ef67e410d2d\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5d66576fd5-rdjx7" Apr 16 18:02:30.715515 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:02:30.715487 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/92e5c48f-d6f3-4ba5-be05-0ef67e410d2d-tokenizer-tmp\") pod \"router-with-refs-test-kserve-router-scheduler-5d66576fd5-rdjx7\" (UID: \"92e5c48f-d6f3-4ba5-be05-0ef67e410d2d\") " 
pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5d66576fd5-rdjx7" Apr 16 18:02:30.715570 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:02:30.715534 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/92e5c48f-d6f3-4ba5-be05-0ef67e410d2d-tokenizer-cache\") pod \"router-with-refs-test-kserve-router-scheduler-5d66576fd5-rdjx7\" (UID: \"92e5c48f-d6f3-4ba5-be05-0ef67e410d2d\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5d66576fd5-rdjx7" Apr 16 18:02:30.715741 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:02:30.715720 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/92e5c48f-d6f3-4ba5-be05-0ef67e410d2d-kserve-provision-location\") pod \"router-with-refs-test-kserve-router-scheduler-5d66576fd5-rdjx7\" (UID: \"92e5c48f-d6f3-4ba5-be05-0ef67e410d2d\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5d66576fd5-rdjx7" Apr 16 18:02:30.717722 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:02:30.717703 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/92e5c48f-d6f3-4ba5-be05-0ef67e410d2d-tls-certs\") pod \"router-with-refs-test-kserve-router-scheduler-5d66576fd5-rdjx7\" (UID: \"92e5c48f-d6f3-4ba5-be05-0ef67e410d2d\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5d66576fd5-rdjx7" Apr 16 18:02:30.725165 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:02:30.725136 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjbsj\" (UniqueName: \"kubernetes.io/projected/92e5c48f-d6f3-4ba5-be05-0ef67e410d2d-kube-api-access-bjbsj\") pod \"router-with-refs-test-kserve-router-scheduler-5d66576fd5-rdjx7\" (UID: \"92e5c48f-d6f3-4ba5-be05-0ef67e410d2d\") " 
pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5d66576fd5-rdjx7" Apr 16 18:02:30.817167 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:02:30.817080 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5d66576fd5-rdjx7" Apr 16 18:02:30.961735 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:02:30.961707 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5d66576fd5-rdjx7"] Apr 16 18:02:30.963161 ip-10-0-141-32 kubenswrapper[2576]: W0416 18:02:30.963122 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92e5c48f_d6f3_4ba5_be05_0ef67e410d2d.slice/crio-368fbc6bb3fcd369e5b9c2c63cd0fd11775b6d8773683cb384ee2f7b3888fab6 WatchSource:0}: Error finding container 368fbc6bb3fcd369e5b9c2c63cd0fd11775b6d8773683cb384ee2f7b3888fab6: Status 404 returned error can't find the container with id 368fbc6bb3fcd369e5b9c2c63cd0fd11775b6d8773683cb384ee2f7b3888fab6 Apr 16 18:02:31.486957 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:02:31.486893 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5d66576fd5-rdjx7" event={"ID":"92e5c48f-d6f3-4ba5-be05-0ef67e410d2d","Type":"ContainerStarted","Data":"3531adacc65a6920c70777c97847f6e02844e034e2836b3bb29111635dcbd2d5"} Apr 16 18:02:31.486957 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:02:31.486959 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5d66576fd5-rdjx7" event={"ID":"92e5c48f-d6f3-4ba5-be05-0ef67e410d2d","Type":"ContainerStarted","Data":"368fbc6bb3fcd369e5b9c2c63cd0fd11775b6d8773683cb384ee2f7b3888fab6"} Apr 16 18:02:32.495651 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:02:32.495617 2576 generic.go:358] "Generic (PLEG): container 
finished" podID="92e5c48f-d6f3-4ba5-be05-0ef67e410d2d" containerID="3531adacc65a6920c70777c97847f6e02844e034e2836b3bb29111635dcbd2d5" exitCode=0 Apr 16 18:02:32.496064 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:02:32.495701 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5d66576fd5-rdjx7" event={"ID":"92e5c48f-d6f3-4ba5-be05-0ef67e410d2d","Type":"ContainerDied","Data":"3531adacc65a6920c70777c97847f6e02844e034e2836b3bb29111635dcbd2d5"} Apr 16 18:02:33.501796 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:02:33.501749 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5d66576fd5-rdjx7" event={"ID":"92e5c48f-d6f3-4ba5-be05-0ef67e410d2d","Type":"ContainerStarted","Data":"2f4321f43acbf75e709c33c14b111d65a1b6fa22e3b961fe466248c1c7f12111"} Apr 16 18:02:33.501796 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:02:33.501794 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5d66576fd5-rdjx7" event={"ID":"92e5c48f-d6f3-4ba5-be05-0ef67e410d2d","Type":"ContainerStarted","Data":"198f42a3797c5681c0dc2c6cfd71bcfce9d5c6256b846f543c1816a3fb0fea41"} Apr 16 18:02:33.502344 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:02:33.501857 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5d66576fd5-rdjx7" Apr 16 18:02:33.536588 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:02:33.536527 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5d66576fd5-rdjx7" podStartSLOduration=3.5365093119999997 podStartE2EDuration="3.536509312s" podCreationTimestamp="2026-04-16 18:02:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-04-16 18:02:33.535245914 +0000 UTC m=+1351.551713434" watchObservedRunningTime="2026-04-16 18:02:33.536509312 +0000 UTC m=+1351.552976876" Apr 16 18:02:40.817815 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:02:40.817775 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5d66576fd5-rdjx7" Apr 16 18:02:40.818371 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:02:40.817957 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5d66576fd5-rdjx7" Apr 16 18:02:40.820665 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:02:40.820641 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5d66576fd5-rdjx7" Apr 16 18:02:41.535893 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:02:41.535851 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5d66576fd5-rdjx7" Apr 16 18:03:03.544740 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:03:03.544710 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5d66576fd5-rdjx7" Apr 16 18:03:51.025633 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:03:51.025594 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/llmisvc-controller-manager-5644d5958-zwnc7"] Apr 16 18:03:51.026291 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:03:51.025847 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/llmisvc-controller-manager-5644d5958-zwnc7" podUID="33f61e9d-3dfb-429d-a797-917fb922e521" containerName="manager" containerID="cri-o://2fd7db84c6ce15ace65802fc4dfc42b67e41fa46693cd2c71550ba873f5b63ac" gracePeriod=30 Apr 16 18:03:51.276369 
ip-10-0-141-32 kubenswrapper[2576]: I0416 18:03:51.276310 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-5644d5958-zwnc7" Apr 16 18:03:51.311569 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:03:51.311532 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pwxjx\" (UniqueName: \"kubernetes.io/projected/33f61e9d-3dfb-429d-a797-917fb922e521-kube-api-access-pwxjx\") pod \"33f61e9d-3dfb-429d-a797-917fb922e521\" (UID: \"33f61e9d-3dfb-429d-a797-917fb922e521\") " Apr 16 18:03:51.311732 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:03:51.311662 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/33f61e9d-3dfb-429d-a797-917fb922e521-cert\") pod \"33f61e9d-3dfb-429d-a797-917fb922e521\" (UID: \"33f61e9d-3dfb-429d-a797-917fb922e521\") " Apr 16 18:03:51.313762 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:03:51.313727 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33f61e9d-3dfb-429d-a797-917fb922e521-kube-api-access-pwxjx" (OuterVolumeSpecName: "kube-api-access-pwxjx") pod "33f61e9d-3dfb-429d-a797-917fb922e521" (UID: "33f61e9d-3dfb-429d-a797-917fb922e521"). InnerVolumeSpecName "kube-api-access-pwxjx". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:03:51.313900 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:03:51.313741 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33f61e9d-3dfb-429d-a797-917fb922e521-cert" (OuterVolumeSpecName: "cert") pod "33f61e9d-3dfb-429d-a797-917fb922e521" (UID: "33f61e9d-3dfb-429d-a797-917fb922e521"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:03:51.412470 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:03:51.412436 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pwxjx\" (UniqueName: \"kubernetes.io/projected/33f61e9d-3dfb-429d-a797-917fb922e521-kube-api-access-pwxjx\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\"" Apr 16 18:03:51.412470 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:03:51.412466 2576 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/33f61e9d-3dfb-429d-a797-917fb922e521-cert\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\"" Apr 16 18:03:51.806081 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:03:51.806044 2576 generic.go:358] "Generic (PLEG): container finished" podID="33f61e9d-3dfb-429d-a797-917fb922e521" containerID="2fd7db84c6ce15ace65802fc4dfc42b67e41fa46693cd2c71550ba873f5b63ac" exitCode=0 Apr 16 18:03:51.806081 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:03:51.806085 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-5644d5958-zwnc7" event={"ID":"33f61e9d-3dfb-429d-a797-917fb922e521","Type":"ContainerDied","Data":"2fd7db84c6ce15ace65802fc4dfc42b67e41fa46693cd2c71550ba873f5b63ac"} Apr 16 18:03:51.806340 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:03:51.806108 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-5644d5958-zwnc7" event={"ID":"33f61e9d-3dfb-429d-a797-917fb922e521","Type":"ContainerDied","Data":"9fa7ff118f597110fc4d8ea21b6aee5355a9a136d889a45169d3a5265bbeb079"} Apr 16 18:03:51.806340 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:03:51.806111 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-5644d5958-zwnc7" Apr 16 18:03:51.806340 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:03:51.806127 2576 scope.go:117] "RemoveContainer" containerID="2fd7db84c6ce15ace65802fc4dfc42b67e41fa46693cd2c71550ba873f5b63ac" Apr 16 18:03:51.815146 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:03:51.815125 2576 scope.go:117] "RemoveContainer" containerID="2fd7db84c6ce15ace65802fc4dfc42b67e41fa46693cd2c71550ba873f5b63ac" Apr 16 18:03:51.815447 ip-10-0-141-32 kubenswrapper[2576]: E0416 18:03:51.815426 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2fd7db84c6ce15ace65802fc4dfc42b67e41fa46693cd2c71550ba873f5b63ac\": container with ID starting with 2fd7db84c6ce15ace65802fc4dfc42b67e41fa46693cd2c71550ba873f5b63ac not found: ID does not exist" containerID="2fd7db84c6ce15ace65802fc4dfc42b67e41fa46693cd2c71550ba873f5b63ac" Apr 16 18:03:51.815534 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:03:51.815454 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2fd7db84c6ce15ace65802fc4dfc42b67e41fa46693cd2c71550ba873f5b63ac"} err="failed to get container status \"2fd7db84c6ce15ace65802fc4dfc42b67e41fa46693cd2c71550ba873f5b63ac\": rpc error: code = NotFound desc = could not find container \"2fd7db84c6ce15ace65802fc4dfc42b67e41fa46693cd2c71550ba873f5b63ac\": container with ID starting with 2fd7db84c6ce15ace65802fc4dfc42b67e41fa46693cd2c71550ba873f5b63ac not found: ID does not exist" Apr 16 18:03:51.829570 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:03:51.829548 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/llmisvc-controller-manager-5644d5958-zwnc7"] Apr 16 18:03:51.833760 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:03:51.833738 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/llmisvc-controller-manager-5644d5958-zwnc7"] Apr 16 18:03:52.627007 ip-10-0-141-32 
kubenswrapper[2576]: I0416 18:03:52.626974 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33f61e9d-3dfb-429d-a797-917fb922e521" path="/var/lib/kubelet/pods/33f61e9d-3dfb-429d-a797-917fb922e521/volumes" Apr 16 18:04:18.288966 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:04:18.288935 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5d66576fd5-rdjx7"] Apr 16 18:04:18.289401 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:04:18.289201 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5d66576fd5-rdjx7" podUID="92e5c48f-d6f3-4ba5-be05-0ef67e410d2d" containerName="main" containerID="cri-o://198f42a3797c5681c0dc2c6cfd71bcfce9d5c6256b846f543c1816a3fb0fea41" gracePeriod=30 Apr 16 18:04:18.289401 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:04:18.289284 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5d66576fd5-rdjx7" podUID="92e5c48f-d6f3-4ba5-be05-0ef67e410d2d" containerName="tokenizer" containerID="cri-o://2f4321f43acbf75e709c33c14b111d65a1b6fa22e3b961fe466248c1c7f12111" gracePeriod=30 Apr 16 18:04:18.914654 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:04:18.914615 2576 generic.go:358] "Generic (PLEG): container finished" podID="92e5c48f-d6f3-4ba5-be05-0ef67e410d2d" containerID="198f42a3797c5681c0dc2c6cfd71bcfce9d5c6256b846f543c1816a3fb0fea41" exitCode=0 Apr 16 18:04:18.914851 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:04:18.914687 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5d66576fd5-rdjx7" event={"ID":"92e5c48f-d6f3-4ba5-be05-0ef67e410d2d","Type":"ContainerDied","Data":"198f42a3797c5681c0dc2c6cfd71bcfce9d5c6256b846f543c1816a3fb0fea41"} Apr 16 18:04:19.647630 ip-10-0-141-32 kubenswrapper[2576]: I0416 
18:04:19.647604 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5d66576fd5-rdjx7" Apr 16 18:04:19.767574 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:04:19.767473 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/92e5c48f-d6f3-4ba5-be05-0ef67e410d2d-tokenizer-uds\") pod \"92e5c48f-d6f3-4ba5-be05-0ef67e410d2d\" (UID: \"92e5c48f-d6f3-4ba5-be05-0ef67e410d2d\") " Apr 16 18:04:19.767574 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:04:19.767535 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/92e5c48f-d6f3-4ba5-be05-0ef67e410d2d-tokenizer-cache\") pod \"92e5c48f-d6f3-4ba5-be05-0ef67e410d2d\" (UID: \"92e5c48f-d6f3-4ba5-be05-0ef67e410d2d\") " Apr 16 18:04:19.767574 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:04:19.767567 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/92e5c48f-d6f3-4ba5-be05-0ef67e410d2d-tls-certs\") pod \"92e5c48f-d6f3-4ba5-be05-0ef67e410d2d\" (UID: \"92e5c48f-d6f3-4ba5-be05-0ef67e410d2d\") " Apr 16 18:04:19.767860 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:04:19.767648 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bjbsj\" (UniqueName: \"kubernetes.io/projected/92e5c48f-d6f3-4ba5-be05-0ef67e410d2d-kube-api-access-bjbsj\") pod \"92e5c48f-d6f3-4ba5-be05-0ef67e410d2d\" (UID: \"92e5c48f-d6f3-4ba5-be05-0ef67e410d2d\") " Apr 16 18:04:19.767860 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:04:19.767700 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/92e5c48f-d6f3-4ba5-be05-0ef67e410d2d-tokenizer-tmp\") pod \"92e5c48f-d6f3-4ba5-be05-0ef67e410d2d\" 
(UID: \"92e5c48f-d6f3-4ba5-be05-0ef67e410d2d\") " Apr 16 18:04:19.767860 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:04:19.767733 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/92e5c48f-d6f3-4ba5-be05-0ef67e410d2d-kserve-provision-location\") pod \"92e5c48f-d6f3-4ba5-be05-0ef67e410d2d\" (UID: \"92e5c48f-d6f3-4ba5-be05-0ef67e410d2d\") " Apr 16 18:04:19.767860 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:04:19.767729 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92e5c48f-d6f3-4ba5-be05-0ef67e410d2d-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "92e5c48f-d6f3-4ba5-be05-0ef67e410d2d" (UID: "92e5c48f-d6f3-4ba5-be05-0ef67e410d2d"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:04:19.767860 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:04:19.767778 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92e5c48f-d6f3-4ba5-be05-0ef67e410d2d-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "92e5c48f-d6f3-4ba5-be05-0ef67e410d2d" (UID: "92e5c48f-d6f3-4ba5-be05-0ef67e410d2d"). InnerVolumeSpecName "tokenizer-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:04:19.768203 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:04:19.768094 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/92e5c48f-d6f3-4ba5-be05-0ef67e410d2d-tokenizer-uds\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\"" Apr 16 18:04:19.768203 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:04:19.768093 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92e5c48f-d6f3-4ba5-be05-0ef67e410d2d-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "92e5c48f-d6f3-4ba5-be05-0ef67e410d2d" (UID: "92e5c48f-d6f3-4ba5-be05-0ef67e410d2d"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:04:19.768203 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:04:19.768116 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/92e5c48f-d6f3-4ba5-be05-0ef67e410d2d-tokenizer-cache\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\"" Apr 16 18:04:19.768484 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:04:19.768459 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92e5c48f-d6f3-4ba5-be05-0ef67e410d2d-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "92e5c48f-d6f3-4ba5-be05-0ef67e410d2d" (UID: "92e5c48f-d6f3-4ba5-be05-0ef67e410d2d"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:04:19.769776 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:04:19.769753 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92e5c48f-d6f3-4ba5-be05-0ef67e410d2d-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "92e5c48f-d6f3-4ba5-be05-0ef67e410d2d" (UID: "92e5c48f-d6f3-4ba5-be05-0ef67e410d2d"). 
InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:04:19.769838 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:04:19.769798 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92e5c48f-d6f3-4ba5-be05-0ef67e410d2d-kube-api-access-bjbsj" (OuterVolumeSpecName: "kube-api-access-bjbsj") pod "92e5c48f-d6f3-4ba5-be05-0ef67e410d2d" (UID: "92e5c48f-d6f3-4ba5-be05-0ef67e410d2d"). InnerVolumeSpecName "kube-api-access-bjbsj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:04:19.869382 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:04:19.869329 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bjbsj\" (UniqueName: \"kubernetes.io/projected/92e5c48f-d6f3-4ba5-be05-0ef67e410d2d-kube-api-access-bjbsj\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\"" Apr 16 18:04:19.869382 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:04:19.869375 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/92e5c48f-d6f3-4ba5-be05-0ef67e410d2d-tokenizer-tmp\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\"" Apr 16 18:04:19.869382 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:04:19.869385 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/92e5c48f-d6f3-4ba5-be05-0ef67e410d2d-kserve-provision-location\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\"" Apr 16 18:04:19.869382 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:04:19.869397 2576 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/92e5c48f-d6f3-4ba5-be05-0ef67e410d2d-tls-certs\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\"" Apr 16 18:04:19.921000 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:04:19.920968 2576 generic.go:358] "Generic (PLEG): container finished" 
podID="92e5c48f-d6f3-4ba5-be05-0ef67e410d2d" containerID="2f4321f43acbf75e709c33c14b111d65a1b6fa22e3b961fe466248c1c7f12111" exitCode=0 Apr 16 18:04:19.921189 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:04:19.921056 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5d66576fd5-rdjx7" Apr 16 18:04:19.921189 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:04:19.921085 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5d66576fd5-rdjx7" event={"ID":"92e5c48f-d6f3-4ba5-be05-0ef67e410d2d","Type":"ContainerDied","Data":"2f4321f43acbf75e709c33c14b111d65a1b6fa22e3b961fe466248c1c7f12111"} Apr 16 18:04:19.921189 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:04:19.921119 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5d66576fd5-rdjx7" event={"ID":"92e5c48f-d6f3-4ba5-be05-0ef67e410d2d","Type":"ContainerDied","Data":"368fbc6bb3fcd369e5b9c2c63cd0fd11775b6d8773683cb384ee2f7b3888fab6"} Apr 16 18:04:19.921189 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:04:19.921134 2576 scope.go:117] "RemoveContainer" containerID="2f4321f43acbf75e709c33c14b111d65a1b6fa22e3b961fe466248c1c7f12111" Apr 16 18:04:19.930541 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:04:19.930523 2576 scope.go:117] "RemoveContainer" containerID="198f42a3797c5681c0dc2c6cfd71bcfce9d5c6256b846f543c1816a3fb0fea41" Apr 16 18:04:19.938235 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:04:19.938217 2576 scope.go:117] "RemoveContainer" containerID="3531adacc65a6920c70777c97847f6e02844e034e2836b3bb29111635dcbd2d5" Apr 16 18:04:19.945823 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:04:19.945790 2576 scope.go:117] "RemoveContainer" containerID="2f4321f43acbf75e709c33c14b111d65a1b6fa22e3b961fe466248c1c7f12111" Apr 16 18:04:19.946097 ip-10-0-141-32 kubenswrapper[2576]: E0416 
18:04:19.946075 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f4321f43acbf75e709c33c14b111d65a1b6fa22e3b961fe466248c1c7f12111\": container with ID starting with 2f4321f43acbf75e709c33c14b111d65a1b6fa22e3b961fe466248c1c7f12111 not found: ID does not exist" containerID="2f4321f43acbf75e709c33c14b111d65a1b6fa22e3b961fe466248c1c7f12111" Apr 16 18:04:19.946193 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:04:19.946109 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f4321f43acbf75e709c33c14b111d65a1b6fa22e3b961fe466248c1c7f12111"} err="failed to get container status \"2f4321f43acbf75e709c33c14b111d65a1b6fa22e3b961fe466248c1c7f12111\": rpc error: code = NotFound desc = could not find container \"2f4321f43acbf75e709c33c14b111d65a1b6fa22e3b961fe466248c1c7f12111\": container with ID starting with 2f4321f43acbf75e709c33c14b111d65a1b6fa22e3b961fe466248c1c7f12111 not found: ID does not exist" Apr 16 18:04:19.946193 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:04:19.946132 2576 scope.go:117] "RemoveContainer" containerID="198f42a3797c5681c0dc2c6cfd71bcfce9d5c6256b846f543c1816a3fb0fea41" Apr 16 18:04:19.946358 ip-10-0-141-32 kubenswrapper[2576]: E0416 18:04:19.946339 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"198f42a3797c5681c0dc2c6cfd71bcfce9d5c6256b846f543c1816a3fb0fea41\": container with ID starting with 198f42a3797c5681c0dc2c6cfd71bcfce9d5c6256b846f543c1816a3fb0fea41 not found: ID does not exist" containerID="198f42a3797c5681c0dc2c6cfd71bcfce9d5c6256b846f543c1816a3fb0fea41" Apr 16 18:04:19.946411 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:04:19.946365 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"198f42a3797c5681c0dc2c6cfd71bcfce9d5c6256b846f543c1816a3fb0fea41"} err="failed to get container status 
\"198f42a3797c5681c0dc2c6cfd71bcfce9d5c6256b846f543c1816a3fb0fea41\": rpc error: code = NotFound desc = could not find container \"198f42a3797c5681c0dc2c6cfd71bcfce9d5c6256b846f543c1816a3fb0fea41\": container with ID starting with 198f42a3797c5681c0dc2c6cfd71bcfce9d5c6256b846f543c1816a3fb0fea41 not found: ID does not exist" Apr 16 18:04:19.946411 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:04:19.946381 2576 scope.go:117] "RemoveContainer" containerID="3531adacc65a6920c70777c97847f6e02844e034e2836b3bb29111635dcbd2d5" Apr 16 18:04:19.946668 ip-10-0-141-32 kubenswrapper[2576]: E0416 18:04:19.946649 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3531adacc65a6920c70777c97847f6e02844e034e2836b3bb29111635dcbd2d5\": container with ID starting with 3531adacc65a6920c70777c97847f6e02844e034e2836b3bb29111635dcbd2d5 not found: ID does not exist" containerID="3531adacc65a6920c70777c97847f6e02844e034e2836b3bb29111635dcbd2d5" Apr 16 18:04:19.946744 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:04:19.946673 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3531adacc65a6920c70777c97847f6e02844e034e2836b3bb29111635dcbd2d5"} err="failed to get container status \"3531adacc65a6920c70777c97847f6e02844e034e2836b3bb29111635dcbd2d5\": rpc error: code = NotFound desc = could not find container \"3531adacc65a6920c70777c97847f6e02844e034e2836b3bb29111635dcbd2d5\": container with ID starting with 3531adacc65a6920c70777c97847f6e02844e034e2836b3bb29111635dcbd2d5 not found: ID does not exist" Apr 16 18:04:19.947828 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:04:19.947809 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5d66576fd5-rdjx7"] Apr 16 18:04:19.951354 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:04:19.951332 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5d66576fd5-rdjx7"] Apr 16 18:04:20.627518 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:04:20.627483 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92e5c48f-d6f3-4ba5-be05-0ef67e410d2d" path="/var/lib/kubelet/pods/92e5c48f-d6f3-4ba5-be05-0ef67e410d2d/volumes" Apr 16 18:04:25.204022 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:04:25.203991 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepkr8x"] Apr 16 18:04:25.204573 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:04:25.204352 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="92e5c48f-d6f3-4ba5-be05-0ef67e410d2d" containerName="storage-initializer" Apr 16 18:04:25.204573 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:04:25.204364 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="92e5c48f-d6f3-4ba5-be05-0ef67e410d2d" containerName="storage-initializer" Apr 16 18:04:25.204573 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:04:25.204372 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="33f61e9d-3dfb-429d-a797-917fb922e521" containerName="manager" Apr 16 18:04:25.204573 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:04:25.204379 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="33f61e9d-3dfb-429d-a797-917fb922e521" containerName="manager" Apr 16 18:04:25.204573 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:04:25.204396 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="92e5c48f-d6f3-4ba5-be05-0ef67e410d2d" containerName="tokenizer" Apr 16 18:04:25.204573 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:04:25.204401 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="92e5c48f-d6f3-4ba5-be05-0ef67e410d2d" containerName="tokenizer" Apr 16 18:04:25.204573 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:04:25.204408 2576 
cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="92e5c48f-d6f3-4ba5-be05-0ef67e410d2d" containerName="main" Apr 16 18:04:25.204573 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:04:25.204413 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="92e5c48f-d6f3-4ba5-be05-0ef67e410d2d" containerName="main" Apr 16 18:04:25.204573 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:04:25.204468 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="92e5c48f-d6f3-4ba5-be05-0ef67e410d2d" containerName="main" Apr 16 18:04:25.204573 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:04:25.204477 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="92e5c48f-d6f3-4ba5-be05-0ef67e410d2d" containerName="tokenizer" Apr 16 18:04:25.204573 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:04:25.204484 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="33f61e9d-3dfb-429d-a797-917fb922e521" containerName="manager" Apr 16 18:04:25.209441 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:04:25.209413 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepkr8x" Apr 16 18:04:25.213091 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:04:25.213067 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-4mzq6\"" Apr 16 18:04:25.213215 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:04:25.213128 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-05aa9bba-epp-sa-dockercfg-gzs4x\"" Apr 16 18:04:25.214209 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:04:25.214189 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv8f1a6f044e8c7a4d31a250e0c4861caf-kserve-self-signed-certs\"" Apr 16 18:04:25.229402 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:04:25.229382 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepkr8x"] Apr 16 18:04:25.318639 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:04:25.318598 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/46932405-4651-4064-bb92-f5acbf9e4ac8-tokenizer-tmp\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepkr8x\" (UID: \"46932405-4651-4064-bb92-f5acbf9e4ac8\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepkr8x" Apr 16 18:04:25.318639 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:04:25.318636 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/46932405-4651-4064-bb92-f5acbf9e4ac8-tls-certs\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepkr8x\" (UID: \"46932405-4651-4064-bb92-f5acbf9e4ac8\") " 
pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepkr8x" Apr 16 18:04:25.318885 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:04:25.318657 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/46932405-4651-4064-bb92-f5acbf9e4ac8-kserve-provision-location\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepkr8x\" (UID: \"46932405-4651-4064-bb92-f5acbf9e4ac8\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepkr8x" Apr 16 18:04:25.318885 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:04:25.318769 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/46932405-4651-4064-bb92-f5acbf9e4ac8-tokenizer-uds\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepkr8x\" (UID: \"46932405-4651-4064-bb92-f5acbf9e4ac8\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepkr8x" Apr 16 18:04:25.318885 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:04:25.318802 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/46932405-4651-4064-bb92-f5acbf9e4ac8-tokenizer-cache\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepkr8x\" (UID: \"46932405-4651-4064-bb92-f5acbf9e4ac8\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepkr8x" Apr 16 18:04:25.318885 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:04:25.318817 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pd7h4\" (UniqueName: \"kubernetes.io/projected/46932405-4651-4064-bb92-f5acbf9e4ac8-kube-api-access-pd7h4\") pod 
\"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepkr8x\" (UID: \"46932405-4651-4064-bb92-f5acbf9e4ac8\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepkr8x" Apr 16 18:04:25.419664 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:04:25.419622 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/46932405-4651-4064-bb92-f5acbf9e4ac8-tokenizer-cache\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepkr8x\" (UID: \"46932405-4651-4064-bb92-f5acbf9e4ac8\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepkr8x" Apr 16 18:04:25.419664 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:04:25.419665 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pd7h4\" (UniqueName: \"kubernetes.io/projected/46932405-4651-4064-bb92-f5acbf9e4ac8-kube-api-access-pd7h4\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepkr8x\" (UID: \"46932405-4651-4064-bb92-f5acbf9e4ac8\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepkr8x" Apr 16 18:04:25.419950 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:04:25.419726 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/46932405-4651-4064-bb92-f5acbf9e4ac8-tokenizer-tmp\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepkr8x\" (UID: \"46932405-4651-4064-bb92-f5acbf9e4ac8\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepkr8x" Apr 16 18:04:25.419950 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:04:25.419753 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/46932405-4651-4064-bb92-f5acbf9e4ac8-tls-certs\") pod 
\"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepkr8x\" (UID: \"46932405-4651-4064-bb92-f5acbf9e4ac8\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepkr8x" Apr 16 18:04:25.419950 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:04:25.419778 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/46932405-4651-4064-bb92-f5acbf9e4ac8-kserve-provision-location\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepkr8x\" (UID: \"46932405-4651-4064-bb92-f5acbf9e4ac8\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepkr8x" Apr 16 18:04:25.419950 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:04:25.419861 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/46932405-4651-4064-bb92-f5acbf9e4ac8-tokenizer-uds\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepkr8x\" (UID: \"46932405-4651-4064-bb92-f5acbf9e4ac8\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepkr8x" Apr 16 18:04:25.420189 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:04:25.420125 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/46932405-4651-4064-bb92-f5acbf9e4ac8-tokenizer-cache\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepkr8x\" (UID: \"46932405-4651-4064-bb92-f5acbf9e4ac8\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepkr8x" Apr 16 18:04:25.420189 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:04:25.420173 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/46932405-4651-4064-bb92-f5acbf9e4ac8-tokenizer-tmp\") pod 
\"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepkr8x\" (UID: \"46932405-4651-4064-bb92-f5acbf9e4ac8\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepkr8x" Apr 16 18:04:25.420297 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:04:25.420196 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/46932405-4651-4064-bb92-f5acbf9e4ac8-kserve-provision-location\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepkr8x\" (UID: \"46932405-4651-4064-bb92-f5acbf9e4ac8\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepkr8x" Apr 16 18:04:25.420297 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:04:25.420245 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/46932405-4651-4064-bb92-f5acbf9e4ac8-tokenizer-uds\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepkr8x\" (UID: \"46932405-4651-4064-bb92-f5acbf9e4ac8\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepkr8x" Apr 16 18:04:25.422196 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:04:25.422176 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/46932405-4651-4064-bb92-f5acbf9e4ac8-tls-certs\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepkr8x\" (UID: \"46932405-4651-4064-bb92-f5acbf9e4ac8\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepkr8x" Apr 16 18:04:25.431425 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:04:25.431399 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pd7h4\" (UniqueName: \"kubernetes.io/projected/46932405-4651-4064-bb92-f5acbf9e4ac8-kube-api-access-pd7h4\") pod 
\"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepkr8x\" (UID: \"46932405-4651-4064-bb92-f5acbf9e4ac8\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepkr8x" Apr 16 18:04:25.518748 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:04:25.518651 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepkr8x" Apr 16 18:04:25.670771 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:04:25.670697 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepkr8x"] Apr 16 18:04:25.673597 ip-10-0-141-32 kubenswrapper[2576]: W0416 18:04:25.673565 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46932405_4651_4064_bb92_f5acbf9e4ac8.slice/crio-f568ee391a0f1d4bd4e275e165a92710169f5c3829261b4e990e5b04928f772d WatchSource:0}: Error finding container f568ee391a0f1d4bd4e275e165a92710169f5c3829261b4e990e5b04928f772d: Status 404 returned error can't find the container with id f568ee391a0f1d4bd4e275e165a92710169f5c3829261b4e990e5b04928f772d Apr 16 18:04:25.675432 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:04:25.675415 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 18:04:25.956365 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:04:25.956305 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepkr8x" event={"ID":"46932405-4651-4064-bb92-f5acbf9e4ac8","Type":"ContainerStarted","Data":"7ad40caa53b4822b1d7bb6a89e3ddf454572783efc3634e3b9badeb535148875"} Apr 16 18:04:25.956365 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:04:25.956356 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepkr8x" event={"ID":"46932405-4651-4064-bb92-f5acbf9e4ac8","Type":"ContainerStarted","Data":"f568ee391a0f1d4bd4e275e165a92710169f5c3829261b4e990e5b04928f772d"} Apr 16 18:04:26.962812 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:04:26.962776 2576 generic.go:358] "Generic (PLEG): container finished" podID="46932405-4651-4064-bb92-f5acbf9e4ac8" containerID="7ad40caa53b4822b1d7bb6a89e3ddf454572783efc3634e3b9badeb535148875" exitCode=0 Apr 16 18:04:26.963480 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:04:26.963453 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepkr8x" event={"ID":"46932405-4651-4064-bb92-f5acbf9e4ac8","Type":"ContainerDied","Data":"7ad40caa53b4822b1d7bb6a89e3ddf454572783efc3634e3b9badeb535148875"} Apr 16 18:04:27.971900 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:04:27.971861 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepkr8x" event={"ID":"46932405-4651-4064-bb92-f5acbf9e4ac8","Type":"ContainerStarted","Data":"ea208750d471a8f4a1e8d321c9a63be2597e7774257eb9f721cf2c8b7df01fe8"} Apr 16 18:04:27.972513 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:04:27.972492 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepkr8x" event={"ID":"46932405-4651-4064-bb92-f5acbf9e4ac8","Type":"ContainerStarted","Data":"92cb42749f6d43be10d780d63f4cfe97d5d7d6e32147dde3a8f64759a8d8b1bb"} Apr 16 18:04:27.972734 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:04:27.972708 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepkr8x" Apr 16 18:04:27.999418 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:04:27.999374 2576 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepkr8x" podStartSLOduration=2.999358932 podStartE2EDuration="2.999358932s" podCreationTimestamp="2026-04-16 18:04:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:04:27.998032672 +0000 UTC m=+1466.014500192" watchObservedRunningTime="2026-04-16 18:04:27.999358932 +0000 UTC m=+1466.015826471" Apr 16 18:04:35.519552 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:04:35.519508 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepkr8x" Apr 16 18:04:35.520000 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:04:35.519567 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepkr8x" Apr 16 18:04:35.522452 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:04:35.522424 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepkr8x" Apr 16 18:04:36.007634 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:04:36.007605 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepkr8x" Apr 16 18:04:57.013755 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:04:57.013723 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepkr8x" Apr 16 18:05:02.609035 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:05:02.609005 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-6667474d89-gvbhc_1b5a2170-b541-4c6d-918a-c836b3286e61/cluster-monitoring-operator/0.log" Apr 16 18:05:02.615187 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:05:02.615162 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-6667474d89-gvbhc_1b5a2170-b541-4c6d-918a-c836b3286e61/cluster-monitoring-operator/0.log" Apr 16 18:08:04.185862 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:08:04.185828 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepkr8x"] Apr 16 18:08:04.186507 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:08:04.186158 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepkr8x" podUID="46932405-4651-4064-bb92-f5acbf9e4ac8" containerName="main" containerID="cri-o://92cb42749f6d43be10d780d63f4cfe97d5d7d6e32147dde3a8f64759a8d8b1bb" gracePeriod=30 Apr 16 18:08:04.186507 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:08:04.186198 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepkr8x" podUID="46932405-4651-4064-bb92-f5acbf9e4ac8" containerName="tokenizer" containerID="cri-o://ea208750d471a8f4a1e8d321c9a63be2597e7774257eb9f721cf2c8b7df01fe8" gracePeriod=30 Apr 16 18:08:04.823526 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:08:04.823493 2576 generic.go:358] "Generic (PLEG): container finished" podID="46932405-4651-4064-bb92-f5acbf9e4ac8" containerID="92cb42749f6d43be10d780d63f4cfe97d5d7d6e32147dde3a8f64759a8d8b1bb" exitCode=0 Apr 16 18:08:04.823717 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:08:04.823545 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepkr8x" 
event={"ID":"46932405-4651-4064-bb92-f5acbf9e4ac8","Type":"ContainerDied","Data":"92cb42749f6d43be10d780d63f4cfe97d5d7d6e32147dde3a8f64759a8d8b1bb"} Apr 16 18:08:05.543571 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:08:05.543547 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepkr8x" Apr 16 18:08:05.659832 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:08:05.659796 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/46932405-4651-4064-bb92-f5acbf9e4ac8-tls-certs\") pod \"46932405-4651-4064-bb92-f5acbf9e4ac8\" (UID: \"46932405-4651-4064-bb92-f5acbf9e4ac8\") " Apr 16 18:08:05.660056 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:08:05.659860 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/46932405-4651-4064-bb92-f5acbf9e4ac8-kserve-provision-location\") pod \"46932405-4651-4064-bb92-f5acbf9e4ac8\" (UID: \"46932405-4651-4064-bb92-f5acbf9e4ac8\") " Apr 16 18:08:05.660056 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:08:05.659891 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pd7h4\" (UniqueName: \"kubernetes.io/projected/46932405-4651-4064-bb92-f5acbf9e4ac8-kube-api-access-pd7h4\") pod \"46932405-4651-4064-bb92-f5acbf9e4ac8\" (UID: \"46932405-4651-4064-bb92-f5acbf9e4ac8\") " Apr 16 18:08:05.660056 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:08:05.659954 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/46932405-4651-4064-bb92-f5acbf9e4ac8-tokenizer-cache\") pod \"46932405-4651-4064-bb92-f5acbf9e4ac8\" (UID: \"46932405-4651-4064-bb92-f5acbf9e4ac8\") " Apr 16 18:08:05.660056 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:08:05.660010 2576 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/46932405-4651-4064-bb92-f5acbf9e4ac8-tokenizer-tmp\") pod \"46932405-4651-4064-bb92-f5acbf9e4ac8\" (UID: \"46932405-4651-4064-bb92-f5acbf9e4ac8\") " Apr 16 18:08:05.660056 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:08:05.660033 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/46932405-4651-4064-bb92-f5acbf9e4ac8-tokenizer-uds\") pod \"46932405-4651-4064-bb92-f5acbf9e4ac8\" (UID: \"46932405-4651-4064-bb92-f5acbf9e4ac8\") " Apr 16 18:08:05.660322 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:08:05.660259 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46932405-4651-4064-bb92-f5acbf9e4ac8-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "46932405-4651-4064-bb92-f5acbf9e4ac8" (UID: "46932405-4651-4064-bb92-f5acbf9e4ac8"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:08:05.660369 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:08:05.660342 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46932405-4651-4064-bb92-f5acbf9e4ac8-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "46932405-4651-4064-bb92-f5acbf9e4ac8" (UID: "46932405-4651-4064-bb92-f5acbf9e4ac8"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:08:05.660405 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:08:05.660372 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46932405-4651-4064-bb92-f5acbf9e4ac8-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "46932405-4651-4064-bb92-f5acbf9e4ac8" (UID: "46932405-4651-4064-bb92-f5acbf9e4ac8"). InnerVolumeSpecName "tokenizer-tmp". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:08:05.660744 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:08:05.660721 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46932405-4651-4064-bb92-f5acbf9e4ac8-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "46932405-4651-4064-bb92-f5acbf9e4ac8" (UID: "46932405-4651-4064-bb92-f5acbf9e4ac8"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:08:05.662074 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:08:05.662049 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46932405-4651-4064-bb92-f5acbf9e4ac8-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "46932405-4651-4064-bb92-f5acbf9e4ac8" (UID: "46932405-4651-4064-bb92-f5acbf9e4ac8"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:08:05.662139 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:08:05.662104 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46932405-4651-4064-bb92-f5acbf9e4ac8-kube-api-access-pd7h4" (OuterVolumeSpecName: "kube-api-access-pd7h4") pod "46932405-4651-4064-bb92-f5acbf9e4ac8" (UID: "46932405-4651-4064-bb92-f5acbf9e4ac8"). InnerVolumeSpecName "kube-api-access-pd7h4". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:08:05.760872 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:08:05.760834 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/46932405-4651-4064-bb92-f5acbf9e4ac8-tokenizer-tmp\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\"" Apr 16 18:08:05.760872 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:08:05.760866 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/46932405-4651-4064-bb92-f5acbf9e4ac8-tokenizer-uds\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\"" Apr 16 18:08:05.760872 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:08:05.760874 2576 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/46932405-4651-4064-bb92-f5acbf9e4ac8-tls-certs\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\"" Apr 16 18:08:05.761186 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:08:05.760884 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/46932405-4651-4064-bb92-f5acbf9e4ac8-kserve-provision-location\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\"" Apr 16 18:08:05.761186 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:08:05.760896 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pd7h4\" (UniqueName: \"kubernetes.io/projected/46932405-4651-4064-bb92-f5acbf9e4ac8-kube-api-access-pd7h4\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\"" Apr 16 18:08:05.761186 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:08:05.760930 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/46932405-4651-4064-bb92-f5acbf9e4ac8-tokenizer-cache\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\"" Apr 16 18:08:05.829164 ip-10-0-141-32 kubenswrapper[2576]: 
I0416 18:08:05.829127 2576 generic.go:358] "Generic (PLEG): container finished" podID="46932405-4651-4064-bb92-f5acbf9e4ac8" containerID="ea208750d471a8f4a1e8d321c9a63be2597e7774257eb9f721cf2c8b7df01fe8" exitCode=0 Apr 16 18:08:05.829363 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:08:05.829211 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepkr8x" Apr 16 18:08:05.829363 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:08:05.829209 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepkr8x" event={"ID":"46932405-4651-4064-bb92-f5acbf9e4ac8","Type":"ContainerDied","Data":"ea208750d471a8f4a1e8d321c9a63be2597e7774257eb9f721cf2c8b7df01fe8"} Apr 16 18:08:05.829363 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:08:05.829315 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepkr8x" event={"ID":"46932405-4651-4064-bb92-f5acbf9e4ac8","Type":"ContainerDied","Data":"f568ee391a0f1d4bd4e275e165a92710169f5c3829261b4e990e5b04928f772d"} Apr 16 18:08:05.829363 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:08:05.829332 2576 scope.go:117] "RemoveContainer" containerID="ea208750d471a8f4a1e8d321c9a63be2597e7774257eb9f721cf2c8b7df01fe8" Apr 16 18:08:05.838679 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:08:05.838661 2576 scope.go:117] "RemoveContainer" containerID="92cb42749f6d43be10d780d63f4cfe97d5d7d6e32147dde3a8f64759a8d8b1bb" Apr 16 18:08:05.846779 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:08:05.846758 2576 scope.go:117] "RemoveContainer" containerID="7ad40caa53b4822b1d7bb6a89e3ddf454572783efc3634e3b9badeb535148875" Apr 16 18:08:05.854741 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:08:05.854714 2576 scope.go:117] "RemoveContainer" 
containerID="ea208750d471a8f4a1e8d321c9a63be2597e7774257eb9f721cf2c8b7df01fe8" Apr 16 18:08:05.855041 ip-10-0-141-32 kubenswrapper[2576]: E0416 18:08:05.855021 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea208750d471a8f4a1e8d321c9a63be2597e7774257eb9f721cf2c8b7df01fe8\": container with ID starting with ea208750d471a8f4a1e8d321c9a63be2597e7774257eb9f721cf2c8b7df01fe8 not found: ID does not exist" containerID="ea208750d471a8f4a1e8d321c9a63be2597e7774257eb9f721cf2c8b7df01fe8" Apr 16 18:08:05.855130 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:08:05.855073 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea208750d471a8f4a1e8d321c9a63be2597e7774257eb9f721cf2c8b7df01fe8"} err="failed to get container status \"ea208750d471a8f4a1e8d321c9a63be2597e7774257eb9f721cf2c8b7df01fe8\": rpc error: code = NotFound desc = could not find container \"ea208750d471a8f4a1e8d321c9a63be2597e7774257eb9f721cf2c8b7df01fe8\": container with ID starting with ea208750d471a8f4a1e8d321c9a63be2597e7774257eb9f721cf2c8b7df01fe8 not found: ID does not exist" Apr 16 18:08:05.855130 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:08:05.855095 2576 scope.go:117] "RemoveContainer" containerID="92cb42749f6d43be10d780d63f4cfe97d5d7d6e32147dde3a8f64759a8d8b1bb" Apr 16 18:08:05.855322 ip-10-0-141-32 kubenswrapper[2576]: E0416 18:08:05.855296 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92cb42749f6d43be10d780d63f4cfe97d5d7d6e32147dde3a8f64759a8d8b1bb\": container with ID starting with 92cb42749f6d43be10d780d63f4cfe97d5d7d6e32147dde3a8f64759a8d8b1bb not found: ID does not exist" containerID="92cb42749f6d43be10d780d63f4cfe97d5d7d6e32147dde3a8f64759a8d8b1bb" Apr 16 18:08:05.855366 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:08:05.855318 2576 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"92cb42749f6d43be10d780d63f4cfe97d5d7d6e32147dde3a8f64759a8d8b1bb"} err="failed to get container status \"92cb42749f6d43be10d780d63f4cfe97d5d7d6e32147dde3a8f64759a8d8b1bb\": rpc error: code = NotFound desc = could not find container \"92cb42749f6d43be10d780d63f4cfe97d5d7d6e32147dde3a8f64759a8d8b1bb\": container with ID starting with 92cb42749f6d43be10d780d63f4cfe97d5d7d6e32147dde3a8f64759a8d8b1bb not found: ID does not exist" Apr 16 18:08:05.855366 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:08:05.855331 2576 scope.go:117] "RemoveContainer" containerID="7ad40caa53b4822b1d7bb6a89e3ddf454572783efc3634e3b9badeb535148875" Apr 16 18:08:05.855550 ip-10-0-141-32 kubenswrapper[2576]: E0416 18:08:05.855534 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ad40caa53b4822b1d7bb6a89e3ddf454572783efc3634e3b9badeb535148875\": container with ID starting with 7ad40caa53b4822b1d7bb6a89e3ddf454572783efc3634e3b9badeb535148875 not found: ID does not exist" containerID="7ad40caa53b4822b1d7bb6a89e3ddf454572783efc3634e3b9badeb535148875" Apr 16 18:08:05.855591 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:08:05.855553 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ad40caa53b4822b1d7bb6a89e3ddf454572783efc3634e3b9badeb535148875"} err="failed to get container status \"7ad40caa53b4822b1d7bb6a89e3ddf454572783efc3634e3b9badeb535148875\": rpc error: code = NotFound desc = could not find container \"7ad40caa53b4822b1d7bb6a89e3ddf454572783efc3634e3b9badeb535148875\": container with ID starting with 7ad40caa53b4822b1d7bb6a89e3ddf454572783efc3634e3b9badeb535148875 not found: ID does not exist" Apr 16 18:08:05.857495 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:08:05.857474 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepkr8x"] Apr 16 
18:08:05.862356 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:08:05.862330 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepkr8x"] Apr 16 18:08:06.628015 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:08:06.627980 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46932405-4651-4064-bb92-f5acbf9e4ac8" path="/var/lib/kubelet/pods/46932405-4651-4064-bb92-f5acbf9e4ac8/volumes" Apr 16 18:08:20.621330 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:08:20.621290 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6897fspttt"] Apr 16 18:08:20.621829 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:08:20.621808 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="46932405-4651-4064-bb92-f5acbf9e4ac8" containerName="storage-initializer" Apr 16 18:08:20.621873 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:08:20.621833 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="46932405-4651-4064-bb92-f5acbf9e4ac8" containerName="storage-initializer" Apr 16 18:08:20.621873 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:08:20.621858 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="46932405-4651-4064-bb92-f5acbf9e4ac8" containerName="main" Apr 16 18:08:20.621873 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:08:20.621868 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="46932405-4651-4064-bb92-f5acbf9e4ac8" containerName="main" Apr 16 18:08:20.621998 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:08:20.621886 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="46932405-4651-4064-bb92-f5acbf9e4ac8" containerName="tokenizer" Apr 16 18:08:20.621998 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:08:20.621895 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="46932405-4651-4064-bb92-f5acbf9e4ac8" 
containerName="tokenizer" Apr 16 18:08:20.622070 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:08:20.622020 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="46932405-4651-4064-bb92-f5acbf9e4ac8" containerName="main" Apr 16 18:08:20.622070 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:08:20.622035 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="46932405-4651-4064-bb92-f5acbf9e4ac8" containerName="tokenizer" Apr 16 18:08:20.627438 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:08:20.627412 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6897fspttt" Apr 16 18:08:20.631004 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:08:20.630979 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-epp-sa-dockercfg-jzmsg\"" Apr 16 18:08:20.631151 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:08:20.631060 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-4mzq6\"" Apr 16 18:08:20.631901 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:08:20.631880 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-kserve-self-signed-certs\"" Apr 16 18:08:20.636978 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:08:20.636956 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6897fspttt"] Apr 16 18:08:20.688353 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:08:20.688316 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9b8nz\" (UniqueName: \"kubernetes.io/projected/c2ff8466-1bb0-4c26-83ee-de5225b65d1c-kube-api-access-9b8nz\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-6897fspttt\" (UID: 
\"c2ff8466-1bb0-4c26-83ee-de5225b65d1c\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6897fspttt" Apr 16 18:08:20.688561 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:08:20.688380 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c2ff8466-1bb0-4c26-83ee-de5225b65d1c-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-6897fspttt\" (UID: \"c2ff8466-1bb0-4c26-83ee-de5225b65d1c\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6897fspttt" Apr 16 18:08:20.688561 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:08:20.688421 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/c2ff8466-1bb0-4c26-83ee-de5225b65d1c-tokenizer-uds\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-6897fspttt\" (UID: \"c2ff8466-1bb0-4c26-83ee-de5225b65d1c\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6897fspttt" Apr 16 18:08:20.688561 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:08:20.688456 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/c2ff8466-1bb0-4c26-83ee-de5225b65d1c-tokenizer-tmp\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-6897fspttt\" (UID: \"c2ff8466-1bb0-4c26-83ee-de5225b65d1c\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6897fspttt" Apr 16 18:08:20.688561 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:08:20.688478 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c2ff8466-1bb0-4c26-83ee-de5225b65d1c-kserve-provision-location\") pod 
\"custom-route-timeout-pd-test-kserve-router-scheduler-6897fspttt\" (UID: \"c2ff8466-1bb0-4c26-83ee-de5225b65d1c\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6897fspttt" Apr 16 18:08:20.688725 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:08:20.688591 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/c2ff8466-1bb0-4c26-83ee-de5225b65d1c-tokenizer-cache\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-6897fspttt\" (UID: \"c2ff8466-1bb0-4c26-83ee-de5225b65d1c\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6897fspttt" Apr 16 18:08:20.789970 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:08:20.789927 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/c2ff8466-1bb0-4c26-83ee-de5225b65d1c-tokenizer-uds\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-6897fspttt\" (UID: \"c2ff8466-1bb0-4c26-83ee-de5225b65d1c\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6897fspttt" Apr 16 18:08:20.790158 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:08:20.789982 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/c2ff8466-1bb0-4c26-83ee-de5225b65d1c-tokenizer-tmp\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-6897fspttt\" (UID: \"c2ff8466-1bb0-4c26-83ee-de5225b65d1c\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6897fspttt" Apr 16 18:08:20.790158 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:08:20.790010 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c2ff8466-1bb0-4c26-83ee-de5225b65d1c-kserve-provision-location\") pod 
\"custom-route-timeout-pd-test-kserve-router-scheduler-6897fspttt\" (UID: \"c2ff8466-1bb0-4c26-83ee-de5225b65d1c\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6897fspttt" Apr 16 18:08:20.790158 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:08:20.790105 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/c2ff8466-1bb0-4c26-83ee-de5225b65d1c-tokenizer-cache\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-6897fspttt\" (UID: \"c2ff8466-1bb0-4c26-83ee-de5225b65d1c\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6897fspttt" Apr 16 18:08:20.790326 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:08:20.790134 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9b8nz\" (UniqueName: \"kubernetes.io/projected/c2ff8466-1bb0-4c26-83ee-de5225b65d1c-kube-api-access-9b8nz\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-6897fspttt\" (UID: \"c2ff8466-1bb0-4c26-83ee-de5225b65d1c\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6897fspttt" Apr 16 18:08:20.790326 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:08:20.790225 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c2ff8466-1bb0-4c26-83ee-de5225b65d1c-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-6897fspttt\" (UID: \"c2ff8466-1bb0-4c26-83ee-de5225b65d1c\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6897fspttt" Apr 16 18:08:20.790502 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:08:20.790473 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/c2ff8466-1bb0-4c26-83ee-de5225b65d1c-tokenizer-uds\") pod 
\"custom-route-timeout-pd-test-kserve-router-scheduler-6897fspttt\" (UID: \"c2ff8466-1bb0-4c26-83ee-de5225b65d1c\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6897fspttt" Apr 16 18:08:20.790831 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:08:20.790807 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/c2ff8466-1bb0-4c26-83ee-de5225b65d1c-tokenizer-cache\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-6897fspttt\" (UID: \"c2ff8466-1bb0-4c26-83ee-de5225b65d1c\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6897fspttt" Apr 16 18:08:20.790970 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:08:20.790898 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/c2ff8466-1bb0-4c26-83ee-de5225b65d1c-tokenizer-tmp\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-6897fspttt\" (UID: \"c2ff8466-1bb0-4c26-83ee-de5225b65d1c\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6897fspttt" Apr 16 18:08:20.791063 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:08:20.791041 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c2ff8466-1bb0-4c26-83ee-de5225b65d1c-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-6897fspttt\" (UID: \"c2ff8466-1bb0-4c26-83ee-de5225b65d1c\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6897fspttt" Apr 16 18:08:20.793485 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:08:20.793459 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c2ff8466-1bb0-4c26-83ee-de5225b65d1c-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-6897fspttt\" (UID: 
\"c2ff8466-1bb0-4c26-83ee-de5225b65d1c\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6897fspttt" Apr 16 18:08:20.826230 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:08:20.826198 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9b8nz\" (UniqueName: \"kubernetes.io/projected/c2ff8466-1bb0-4c26-83ee-de5225b65d1c-kube-api-access-9b8nz\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-6897fspttt\" (UID: \"c2ff8466-1bb0-4c26-83ee-de5225b65d1c\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6897fspttt" Apr 16 18:08:20.938391 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:08:20.938354 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6897fspttt" Apr 16 18:08:21.084158 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:08:21.084126 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6897fspttt"] Apr 16 18:08:21.086209 ip-10-0-141-32 kubenswrapper[2576]: W0416 18:08:21.086187 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2ff8466_1bb0_4c26_83ee_de5225b65d1c.slice/crio-e64dbee600b3bf3e51ad594ff7f1f20645ef3a10d1934a63eed3e981431ad222 WatchSource:0}: Error finding container e64dbee600b3bf3e51ad594ff7f1f20645ef3a10d1934a63eed3e981431ad222: Status 404 returned error can't find the container with id e64dbee600b3bf3e51ad594ff7f1f20645ef3a10d1934a63eed3e981431ad222 Apr 16 18:08:21.891038 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:08:21.891003 2576 generic.go:358] "Generic (PLEG): container finished" podID="c2ff8466-1bb0-4c26-83ee-de5225b65d1c" containerID="50678d6df451fbac6012142c5c88c5f61b45a0ceb956ab555c1b00e8ea028792" exitCode=0 Apr 16 18:08:21.891429 ip-10-0-141-32 kubenswrapper[2576]: I0416 
18:08:21.891091 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6897fspttt" event={"ID":"c2ff8466-1bb0-4c26-83ee-de5225b65d1c","Type":"ContainerDied","Data":"50678d6df451fbac6012142c5c88c5f61b45a0ceb956ab555c1b00e8ea028792"} Apr 16 18:08:21.891429 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:08:21.891124 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6897fspttt" event={"ID":"c2ff8466-1bb0-4c26-83ee-de5225b65d1c","Type":"ContainerStarted","Data":"e64dbee600b3bf3e51ad594ff7f1f20645ef3a10d1934a63eed3e981431ad222"} Apr 16 18:08:22.896445 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:08:22.896406 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6897fspttt" event={"ID":"c2ff8466-1bb0-4c26-83ee-de5225b65d1c","Type":"ContainerStarted","Data":"1a1fa9e2e9259ccd643b9eabe74f13168615d99c3034caae0422996c2995eb7c"} Apr 16 18:08:22.896445 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:08:22.896450 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6897fspttt" event={"ID":"c2ff8466-1bb0-4c26-83ee-de5225b65d1c","Type":"ContainerStarted","Data":"fd30b82844c30d86eedd70c12f5b17b02ffccfdb172cc465f9c4f820fc0246fc"} Apr 16 18:08:22.896982 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:08:22.896605 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6897fspttt" Apr 16 18:08:22.935941 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:08:22.935859 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6897fspttt" podStartSLOduration=2.9358379919999997 
podStartE2EDuration="2.935837992s" podCreationTimestamp="2026-04-16 18:08:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:08:22.932377112 +0000 UTC m=+1700.948844633" watchObservedRunningTime="2026-04-16 18:08:22.935837992 +0000 UTC m=+1700.952305515" Apr 16 18:08:30.938805 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:08:30.938771 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6897fspttt" Apr 16 18:08:30.939314 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:08:30.938957 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6897fspttt" Apr 16 18:08:30.941383 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:08:30.941363 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6897fspttt" Apr 16 18:08:31.934029 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:08:31.934003 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6897fspttt" Apr 16 18:08:53.941897 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:08:53.941868 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6897fspttt" Apr 16 18:10:02.638820 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:10:02.638790 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-6667474d89-gvbhc_1b5a2170-b541-4c6d-918a-c836b3286e61/cluster-monitoring-operator/0.log" Apr 16 18:10:02.645630 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:10:02.645603 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-6667474d89-gvbhc_1b5a2170-b541-4c6d-918a-c836b3286e61/cluster-monitoring-operator/0.log" Apr 16 18:11:35.936430 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:35.936339 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6897fspttt"] Apr 16 18:11:35.936862 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:35.936645 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6897fspttt" podUID="c2ff8466-1bb0-4c26-83ee-de5225b65d1c" containerName="main" containerID="cri-o://fd30b82844c30d86eedd70c12f5b17b02ffccfdb172cc465f9c4f820fc0246fc" gracePeriod=30 Apr 16 18:11:35.936862 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:35.936705 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6897fspttt" podUID="c2ff8466-1bb0-4c26-83ee-de5225b65d1c" containerName="tokenizer" containerID="cri-o://1a1fa9e2e9259ccd643b9eabe74f13168615d99c3034caae0422996c2995eb7c" gracePeriod=30 Apr 16 18:11:35.983977 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:35.983945 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-jdq4f"] Apr 16 18:11:35.987286 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:35.987260 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-jdq4f" Apr 16 18:11:35.991554 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:35.991524 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-gateway-2-openshift-default-dockercfg-dbj86\"" Apr 16 18:11:36.001644 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:36.001611 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-jdq4f"] Apr 16 18:11:36.145377 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:36.145340 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/cdff9bd6-0a70-4825-8d1e-3488bf21909f-istio-token\") pod \"router-gateway-2-openshift-default-6866b85949-jdq4f\" (UID: \"cdff9bd6-0a70-4825-8d1e-3488bf21909f\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-jdq4f" Apr 16 18:11:36.145535 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:36.145387 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/cdff9bd6-0a70-4825-8d1e-3488bf21909f-istiod-ca-cert\") pod \"router-gateway-2-openshift-default-6866b85949-jdq4f\" (UID: \"cdff9bd6-0a70-4825-8d1e-3488bf21909f\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-jdq4f" Apr 16 18:11:36.145603 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:36.145524 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tpgc\" (UniqueName: \"kubernetes.io/projected/cdff9bd6-0a70-4825-8d1e-3488bf21909f-kube-api-access-5tpgc\") pod \"router-gateway-2-openshift-default-6866b85949-jdq4f\" (UID: \"cdff9bd6-0a70-4825-8d1e-3488bf21909f\") " 
pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-jdq4f" Apr 16 18:11:36.145603 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:36.145578 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/cdff9bd6-0a70-4825-8d1e-3488bf21909f-workload-socket\") pod \"router-gateway-2-openshift-default-6866b85949-jdq4f\" (UID: \"cdff9bd6-0a70-4825-8d1e-3488bf21909f\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-jdq4f" Apr 16 18:11:36.145727 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:36.145623 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/cdff9bd6-0a70-4825-8d1e-3488bf21909f-istio-envoy\") pod \"router-gateway-2-openshift-default-6866b85949-jdq4f\" (UID: \"cdff9bd6-0a70-4825-8d1e-3488bf21909f\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-jdq4f" Apr 16 18:11:36.145727 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:36.145666 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/cdff9bd6-0a70-4825-8d1e-3488bf21909f-credential-socket\") pod \"router-gateway-2-openshift-default-6866b85949-jdq4f\" (UID: \"cdff9bd6-0a70-4825-8d1e-3488bf21909f\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-jdq4f" Apr 16 18:11:36.145727 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:36.145692 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/cdff9bd6-0a70-4825-8d1e-3488bf21909f-workload-certs\") pod \"router-gateway-2-openshift-default-6866b85949-jdq4f\" (UID: \"cdff9bd6-0a70-4825-8d1e-3488bf21909f\") " 
pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-jdq4f" Apr 16 18:11:36.145727 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:36.145710 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/cdff9bd6-0a70-4825-8d1e-3488bf21909f-istio-data\") pod \"router-gateway-2-openshift-default-6866b85949-jdq4f\" (UID: \"cdff9bd6-0a70-4825-8d1e-3488bf21909f\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-jdq4f" Apr 16 18:11:36.145727 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:36.145724 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/cdff9bd6-0a70-4825-8d1e-3488bf21909f-istio-podinfo\") pod \"router-gateway-2-openshift-default-6866b85949-jdq4f\" (UID: \"cdff9bd6-0a70-4825-8d1e-3488bf21909f\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-jdq4f" Apr 16 18:11:36.246726 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:36.246614 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5tpgc\" (UniqueName: \"kubernetes.io/projected/cdff9bd6-0a70-4825-8d1e-3488bf21909f-kube-api-access-5tpgc\") pod \"router-gateway-2-openshift-default-6866b85949-jdq4f\" (UID: \"cdff9bd6-0a70-4825-8d1e-3488bf21909f\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-jdq4f" Apr 16 18:11:36.246726 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:36.246683 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/cdff9bd6-0a70-4825-8d1e-3488bf21909f-workload-socket\") pod \"router-gateway-2-openshift-default-6866b85949-jdq4f\" (UID: \"cdff9bd6-0a70-4825-8d1e-3488bf21909f\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-jdq4f" Apr 16 18:11:36.247023 
ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:36.246730 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/cdff9bd6-0a70-4825-8d1e-3488bf21909f-istio-envoy\") pod \"router-gateway-2-openshift-default-6866b85949-jdq4f\" (UID: \"cdff9bd6-0a70-4825-8d1e-3488bf21909f\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-jdq4f" Apr 16 18:11:36.247023 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:36.246776 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/cdff9bd6-0a70-4825-8d1e-3488bf21909f-credential-socket\") pod \"router-gateway-2-openshift-default-6866b85949-jdq4f\" (UID: \"cdff9bd6-0a70-4825-8d1e-3488bf21909f\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-jdq4f" Apr 16 18:11:36.247023 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:36.246816 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/cdff9bd6-0a70-4825-8d1e-3488bf21909f-workload-certs\") pod \"router-gateway-2-openshift-default-6866b85949-jdq4f\" (UID: \"cdff9bd6-0a70-4825-8d1e-3488bf21909f\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-jdq4f" Apr 16 18:11:36.247023 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:36.246841 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/cdff9bd6-0a70-4825-8d1e-3488bf21909f-istio-data\") pod \"router-gateway-2-openshift-default-6866b85949-jdq4f\" (UID: \"cdff9bd6-0a70-4825-8d1e-3488bf21909f\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-jdq4f" Apr 16 18:11:36.247023 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:36.246866 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" 
(UniqueName: \"kubernetes.io/downward-api/cdff9bd6-0a70-4825-8d1e-3488bf21909f-istio-podinfo\") pod \"router-gateway-2-openshift-default-6866b85949-jdq4f\" (UID: \"cdff9bd6-0a70-4825-8d1e-3488bf21909f\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-jdq4f" Apr 16 18:11:36.247023 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:36.246892 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/cdff9bd6-0a70-4825-8d1e-3488bf21909f-istio-token\") pod \"router-gateway-2-openshift-default-6866b85949-jdq4f\" (UID: \"cdff9bd6-0a70-4825-8d1e-3488bf21909f\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-jdq4f" Apr 16 18:11:36.247023 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:36.246941 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/cdff9bd6-0a70-4825-8d1e-3488bf21909f-istiod-ca-cert\") pod \"router-gateway-2-openshift-default-6866b85949-jdq4f\" (UID: \"cdff9bd6-0a70-4825-8d1e-3488bf21909f\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-jdq4f" Apr 16 18:11:36.247403 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:36.247125 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/cdff9bd6-0a70-4825-8d1e-3488bf21909f-workload-socket\") pod \"router-gateway-2-openshift-default-6866b85949-jdq4f\" (UID: \"cdff9bd6-0a70-4825-8d1e-3488bf21909f\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-jdq4f" Apr 16 18:11:36.247403 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:36.247213 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/cdff9bd6-0a70-4825-8d1e-3488bf21909f-workload-certs\") pod \"router-gateway-2-openshift-default-6866b85949-jdq4f\" (UID: 
\"cdff9bd6-0a70-4825-8d1e-3488bf21909f\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-jdq4f" Apr 16 18:11:36.247861 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:36.247725 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/cdff9bd6-0a70-4825-8d1e-3488bf21909f-istiod-ca-cert\") pod \"router-gateway-2-openshift-default-6866b85949-jdq4f\" (UID: \"cdff9bd6-0a70-4825-8d1e-3488bf21909f\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-jdq4f" Apr 16 18:11:36.248000 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:36.247970 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/cdff9bd6-0a70-4825-8d1e-3488bf21909f-credential-socket\") pod \"router-gateway-2-openshift-default-6866b85949-jdq4f\" (UID: \"cdff9bd6-0a70-4825-8d1e-3488bf21909f\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-jdq4f" Apr 16 18:11:36.248131 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:36.248104 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/cdff9bd6-0a70-4825-8d1e-3488bf21909f-istio-data\") pod \"router-gateway-2-openshift-default-6866b85949-jdq4f\" (UID: \"cdff9bd6-0a70-4825-8d1e-3488bf21909f\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-jdq4f" Apr 16 18:11:36.249853 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:36.249830 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/cdff9bd6-0a70-4825-8d1e-3488bf21909f-istio-envoy\") pod \"router-gateway-2-openshift-default-6866b85949-jdq4f\" (UID: \"cdff9bd6-0a70-4825-8d1e-3488bf21909f\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-jdq4f" Apr 16 18:11:36.250036 ip-10-0-141-32 kubenswrapper[2576]: I0416 
18:11:36.250018 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/cdff9bd6-0a70-4825-8d1e-3488bf21909f-istio-podinfo\") pod \"router-gateway-2-openshift-default-6866b85949-jdq4f\" (UID: \"cdff9bd6-0a70-4825-8d1e-3488bf21909f\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-jdq4f" Apr 16 18:11:36.258688 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:36.258654 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tpgc\" (UniqueName: \"kubernetes.io/projected/cdff9bd6-0a70-4825-8d1e-3488bf21909f-kube-api-access-5tpgc\") pod \"router-gateway-2-openshift-default-6866b85949-jdq4f\" (UID: \"cdff9bd6-0a70-4825-8d1e-3488bf21909f\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-jdq4f" Apr 16 18:11:36.261717 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:36.261685 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/cdff9bd6-0a70-4825-8d1e-3488bf21909f-istio-token\") pod \"router-gateway-2-openshift-default-6866b85949-jdq4f\" (UID: \"cdff9bd6-0a70-4825-8d1e-3488bf21909f\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-jdq4f" Apr 16 18:11:36.304100 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:36.304060 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-jdq4f" Apr 16 18:11:36.460982 ip-10-0-141-32 kubenswrapper[2576]: W0416 18:11:36.460944 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcdff9bd6_0a70_4825_8d1e_3488bf21909f.slice/crio-50c274a89e75bd68aabb45588811c4ca6e0c930f005c4fd459901749bb40acb6 WatchSource:0}: Error finding container 50c274a89e75bd68aabb45588811c4ca6e0c930f005c4fd459901749bb40acb6: Status 404 returned error can't find the container with id 50c274a89e75bd68aabb45588811c4ca6e0c930f005c4fd459901749bb40acb6 Apr 16 18:11:36.461120 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:36.460988 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-jdq4f"] Apr 16 18:11:36.463036 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:36.463019 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 18:11:36.463495 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:36.463467 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 16 18:11:36.463558 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:36.463542 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 16 18:11:36.463602 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:36.463589 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 16 18:11:36.652788 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:36.652753 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-jdq4f" event={"ID":"cdff9bd6-0a70-4825-8d1e-3488bf21909f","Type":"ContainerStarted","Data":"a239aa353907e14c829424755d9eaeca471d9ba6dfa3f9e009989a57e84977fc"} Apr 16 18:11:36.652788 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:36.652795 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-jdq4f" event={"ID":"cdff9bd6-0a70-4825-8d1e-3488bf21909f","Type":"ContainerStarted","Data":"50c274a89e75bd68aabb45588811c4ca6e0c930f005c4fd459901749bb40acb6"} Apr 16 18:11:36.654666 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:36.654642 2576 generic.go:358] "Generic (PLEG): container finished" podID="c2ff8466-1bb0-4c26-83ee-de5225b65d1c" containerID="fd30b82844c30d86eedd70c12f5b17b02ffccfdb172cc465f9c4f820fc0246fc" exitCode=0 Apr 16 18:11:36.654828 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:36.654816 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6897fspttt" event={"ID":"c2ff8466-1bb0-4c26-83ee-de5225b65d1c","Type":"ContainerDied","Data":"fd30b82844c30d86eedd70c12f5b17b02ffccfdb172cc465f9c4f820fc0246fc"} Apr 16 18:11:36.679045 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:36.678980 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-jdq4f" podStartSLOduration=1.678962736 podStartE2EDuration="1.678962736s" podCreationTimestamp="2026-04-16 18:11:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:11:36.675755876 +0000 UTC m=+1894.692223399" watchObservedRunningTime="2026-04-16 18:11:36.678962736 +0000 UTC m=+1894.695430256" Apr 16 18:11:37.304582 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:37.304541 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-jdq4f" Apr 16 18:11:37.310846 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:37.310816 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-jdq4f" Apr 16 18:11:37.401289 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:37.401263 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6897fspttt" Apr 16 18:11:37.561486 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:37.561394 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/c2ff8466-1bb0-4c26-83ee-de5225b65d1c-tokenizer-tmp\") pod \"c2ff8466-1bb0-4c26-83ee-de5225b65d1c\" (UID: \"c2ff8466-1bb0-4c26-83ee-de5225b65d1c\") " Apr 16 18:11:37.561486 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:37.561442 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c2ff8466-1bb0-4c26-83ee-de5225b65d1c-kserve-provision-location\") pod \"c2ff8466-1bb0-4c26-83ee-de5225b65d1c\" (UID: \"c2ff8466-1bb0-4c26-83ee-de5225b65d1c\") " Apr 16 18:11:37.561486 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:37.561473 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c2ff8466-1bb0-4c26-83ee-de5225b65d1c-tls-certs\") pod \"c2ff8466-1bb0-4c26-83ee-de5225b65d1c\" (UID: \"c2ff8466-1bb0-4c26-83ee-de5225b65d1c\") " Apr 16 18:11:37.561486 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:37.561490 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/c2ff8466-1bb0-4c26-83ee-de5225b65d1c-tokenizer-uds\") pod 
\"c2ff8466-1bb0-4c26-83ee-de5225b65d1c\" (UID: \"c2ff8466-1bb0-4c26-83ee-de5225b65d1c\") " Apr 16 18:11:37.561838 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:37.561506 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/c2ff8466-1bb0-4c26-83ee-de5225b65d1c-tokenizer-cache\") pod \"c2ff8466-1bb0-4c26-83ee-de5225b65d1c\" (UID: \"c2ff8466-1bb0-4c26-83ee-de5225b65d1c\") " Apr 16 18:11:37.561838 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:37.561596 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9b8nz\" (UniqueName: \"kubernetes.io/projected/c2ff8466-1bb0-4c26-83ee-de5225b65d1c-kube-api-access-9b8nz\") pod \"c2ff8466-1bb0-4c26-83ee-de5225b65d1c\" (UID: \"c2ff8466-1bb0-4c26-83ee-de5225b65d1c\") " Apr 16 18:11:37.561838 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:37.561734 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2ff8466-1bb0-4c26-83ee-de5225b65d1c-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "c2ff8466-1bb0-4c26-83ee-de5225b65d1c" (UID: "c2ff8466-1bb0-4c26-83ee-de5225b65d1c"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:11:37.561838 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:37.561805 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2ff8466-1bb0-4c26-83ee-de5225b65d1c-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "c2ff8466-1bb0-4c26-83ee-de5225b65d1c" (UID: "c2ff8466-1bb0-4c26-83ee-de5225b65d1c"). InnerVolumeSpecName "tokenizer-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:11:37.561838 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:37.561818 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2ff8466-1bb0-4c26-83ee-de5225b65d1c-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "c2ff8466-1bb0-4c26-83ee-de5225b65d1c" (UID: "c2ff8466-1bb0-4c26-83ee-de5225b65d1c"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:11:37.562090 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:37.561935 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/c2ff8466-1bb0-4c26-83ee-de5225b65d1c-tokenizer-uds\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\"" Apr 16 18:11:37.562090 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:37.561958 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/c2ff8466-1bb0-4c26-83ee-de5225b65d1c-tokenizer-cache\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\"" Apr 16 18:11:37.562090 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:37.561973 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/c2ff8466-1bb0-4c26-83ee-de5225b65d1c-tokenizer-tmp\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\"" Apr 16 18:11:37.562294 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:37.562266 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2ff8466-1bb0-4c26-83ee-de5225b65d1c-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c2ff8466-1bb0-4c26-83ee-de5225b65d1c" (UID: "c2ff8466-1bb0-4c26-83ee-de5225b65d1c"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:11:37.563801 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:37.563777 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2ff8466-1bb0-4c26-83ee-de5225b65d1c-kube-api-access-9b8nz" (OuterVolumeSpecName: "kube-api-access-9b8nz") pod "c2ff8466-1bb0-4c26-83ee-de5225b65d1c" (UID: "c2ff8466-1bb0-4c26-83ee-de5225b65d1c"). InnerVolumeSpecName "kube-api-access-9b8nz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:11:37.563947 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:37.563920 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2ff8466-1bb0-4c26-83ee-de5225b65d1c-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "c2ff8466-1bb0-4c26-83ee-de5225b65d1c" (UID: "c2ff8466-1bb0-4c26-83ee-de5225b65d1c"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:11:37.660308 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:37.660272 2576 generic.go:358] "Generic (PLEG): container finished" podID="c2ff8466-1bb0-4c26-83ee-de5225b65d1c" containerID="1a1fa9e2e9259ccd643b9eabe74f13168615d99c3034caae0422996c2995eb7c" exitCode=0 Apr 16 18:11:37.660476 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:37.660347 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6897fspttt" event={"ID":"c2ff8466-1bb0-4c26-83ee-de5225b65d1c","Type":"ContainerDied","Data":"1a1fa9e2e9259ccd643b9eabe74f13168615d99c3034caae0422996c2995eb7c"} Apr 16 18:11:37.660476 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:37.660373 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6897fspttt" Apr 16 18:11:37.660476 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:37.660395 2576 scope.go:117] "RemoveContainer" containerID="1a1fa9e2e9259ccd643b9eabe74f13168615d99c3034caae0422996c2995eb7c" Apr 16 18:11:37.660476 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:37.660382 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6897fspttt" event={"ID":"c2ff8466-1bb0-4c26-83ee-de5225b65d1c","Type":"ContainerDied","Data":"e64dbee600b3bf3e51ad594ff7f1f20645ef3a10d1934a63eed3e981431ad222"} Apr 16 18:11:37.661086 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:37.661020 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-jdq4f" Apr 16 18:11:37.662352 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:37.662333 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-jdq4f" Apr 16 18:11:37.662629 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:37.662603 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9b8nz\" (UniqueName: \"kubernetes.io/projected/c2ff8466-1bb0-4c26-83ee-de5225b65d1c-kube-api-access-9b8nz\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\"" Apr 16 18:11:37.662714 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:37.662636 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c2ff8466-1bb0-4c26-83ee-de5225b65d1c-kserve-provision-location\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\"" Apr 16 18:11:37.662714 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:37.662652 2576 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/c2ff8466-1bb0-4c26-83ee-de5225b65d1c-tls-certs\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\"" Apr 16 18:11:37.670052 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:37.669965 2576 scope.go:117] "RemoveContainer" containerID="fd30b82844c30d86eedd70c12f5b17b02ffccfdb172cc465f9c4f820fc0246fc" Apr 16 18:11:37.680316 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:37.680295 2576 scope.go:117] "RemoveContainer" containerID="50678d6df451fbac6012142c5c88c5f61b45a0ceb956ab555c1b00e8ea028792" Apr 16 18:11:37.701953 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:37.701928 2576 scope.go:117] "RemoveContainer" containerID="1a1fa9e2e9259ccd643b9eabe74f13168615d99c3034caae0422996c2995eb7c" Apr 16 18:11:37.702932 ip-10-0-141-32 kubenswrapper[2576]: E0416 18:11:37.702867 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a1fa9e2e9259ccd643b9eabe74f13168615d99c3034caae0422996c2995eb7c\": container with ID starting with 1a1fa9e2e9259ccd643b9eabe74f13168615d99c3034caae0422996c2995eb7c not found: ID does not exist" containerID="1a1fa9e2e9259ccd643b9eabe74f13168615d99c3034caae0422996c2995eb7c" Apr 16 18:11:37.703133 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:37.703073 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a1fa9e2e9259ccd643b9eabe74f13168615d99c3034caae0422996c2995eb7c"} err="failed to get container status \"1a1fa9e2e9259ccd643b9eabe74f13168615d99c3034caae0422996c2995eb7c\": rpc error: code = NotFound desc = could not find container \"1a1fa9e2e9259ccd643b9eabe74f13168615d99c3034caae0422996c2995eb7c\": container with ID starting with 1a1fa9e2e9259ccd643b9eabe74f13168615d99c3034caae0422996c2995eb7c not found: ID does not exist" Apr 16 18:11:37.703133 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:37.703108 2576 scope.go:117] "RemoveContainer" 
containerID="fd30b82844c30d86eedd70c12f5b17b02ffccfdb172cc465f9c4f820fc0246fc" Apr 16 18:11:37.704182 ip-10-0-141-32 kubenswrapper[2576]: E0416 18:11:37.703696 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd30b82844c30d86eedd70c12f5b17b02ffccfdb172cc465f9c4f820fc0246fc\": container with ID starting with fd30b82844c30d86eedd70c12f5b17b02ffccfdb172cc465f9c4f820fc0246fc not found: ID does not exist" containerID="fd30b82844c30d86eedd70c12f5b17b02ffccfdb172cc465f9c4f820fc0246fc" Apr 16 18:11:37.704182 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:37.703740 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd30b82844c30d86eedd70c12f5b17b02ffccfdb172cc465f9c4f820fc0246fc"} err="failed to get container status \"fd30b82844c30d86eedd70c12f5b17b02ffccfdb172cc465f9c4f820fc0246fc\": rpc error: code = NotFound desc = could not find container \"fd30b82844c30d86eedd70c12f5b17b02ffccfdb172cc465f9c4f820fc0246fc\": container with ID starting with fd30b82844c30d86eedd70c12f5b17b02ffccfdb172cc465f9c4f820fc0246fc not found: ID does not exist" Apr 16 18:11:37.704182 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:37.703766 2576 scope.go:117] "RemoveContainer" containerID="50678d6df451fbac6012142c5c88c5f61b45a0ceb956ab555c1b00e8ea028792" Apr 16 18:11:37.704182 ip-10-0-141-32 kubenswrapper[2576]: E0416 18:11:37.704083 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50678d6df451fbac6012142c5c88c5f61b45a0ceb956ab555c1b00e8ea028792\": container with ID starting with 50678d6df451fbac6012142c5c88c5f61b45a0ceb956ab555c1b00e8ea028792 not found: ID does not exist" containerID="50678d6df451fbac6012142c5c88c5f61b45a0ceb956ab555c1b00e8ea028792" Apr 16 18:11:37.704182 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:37.704118 2576 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"50678d6df451fbac6012142c5c88c5f61b45a0ceb956ab555c1b00e8ea028792"} err="failed to get container status \"50678d6df451fbac6012142c5c88c5f61b45a0ceb956ab555c1b00e8ea028792\": rpc error: code = NotFound desc = could not find container \"50678d6df451fbac6012142c5c88c5f61b45a0ceb956ab555c1b00e8ea028792\": container with ID starting with 50678d6df451fbac6012142c5c88c5f61b45a0ceb956ab555c1b00e8ea028792 not found: ID does not exist" Apr 16 18:11:37.720261 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:37.719046 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6897fspttt"] Apr 16 18:11:37.728417 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:37.728382 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6897fspttt"] Apr 16 18:11:38.627669 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:38.627632 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2ff8466-1bb0-4c26-83ee-de5225b65d1c" path="/var/lib/kubelet/pods/c2ff8466-1bb0-4c26-83ee-de5225b65d1c/volumes" Apr 16 18:11:45.239151 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:45.239111 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5cf4d7c5cfj27g"] Apr 16 18:11:45.239763 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:45.239609 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c2ff8466-1bb0-4c26-83ee-de5225b65d1c" containerName="storage-initializer" Apr 16 18:11:45.239763 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:45.239630 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2ff8466-1bb0-4c26-83ee-de5225b65d1c" containerName="storage-initializer" Apr 16 18:11:45.239763 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:45.239659 2576 cpu_manager.go:401] "RemoveStaleState: 
containerMap: removing container" podUID="c2ff8466-1bb0-4c26-83ee-de5225b65d1c" containerName="main" Apr 16 18:11:45.239763 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:45.239672 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2ff8466-1bb0-4c26-83ee-de5225b65d1c" containerName="main" Apr 16 18:11:45.239763 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:45.239689 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c2ff8466-1bb0-4c26-83ee-de5225b65d1c" containerName="tokenizer" Apr 16 18:11:45.239763 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:45.239698 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2ff8466-1bb0-4c26-83ee-de5225b65d1c" containerName="tokenizer" Apr 16 18:11:45.240133 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:45.239781 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="c2ff8466-1bb0-4c26-83ee-de5225b65d1c" containerName="main" Apr 16 18:11:45.240133 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:45.239796 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="c2ff8466-1bb0-4c26-83ee-de5225b65d1c" containerName="tokenizer" Apr 16 18:11:45.244395 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:45.244370 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5cf4d7c5cfj27g" Apr 16 18:11:45.251942 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:45.251563 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-kserve-self-signed-certs\"" Apr 16 18:11:45.251942 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:45.251709 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-epp-sa-dockercfg-wglqv\"" Apr 16 18:11:45.251942 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:45.251809 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-4mzq6\"" Apr 16 18:11:45.267318 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:45.267286 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5cf4d7c5cfj27g"] Apr 16 18:11:45.336451 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:45.331472 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/62b8f263-e78f-43da-81be-1590c9873bf4-tokenizer-cache\") pod \"router-with-refs-pd-test-kserve-router-scheduler-5cf4d7c5cfj27g\" (UID: \"62b8f263-e78f-43da-81be-1590c9873bf4\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5cf4d7c5cfj27g" Apr 16 18:11:45.336451 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:45.331540 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jg4x8\" (UniqueName: \"kubernetes.io/projected/62b8f263-e78f-43da-81be-1590c9873bf4-kube-api-access-jg4x8\") pod \"router-with-refs-pd-test-kserve-router-scheduler-5cf4d7c5cfj27g\" (UID: \"62b8f263-e78f-43da-81be-1590c9873bf4\") " 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5cf4d7c5cfj27g" Apr 16 18:11:45.336451 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:45.331599 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/62b8f263-e78f-43da-81be-1590c9873bf4-tls-certs\") pod \"router-with-refs-pd-test-kserve-router-scheduler-5cf4d7c5cfj27g\" (UID: \"62b8f263-e78f-43da-81be-1590c9873bf4\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5cf4d7c5cfj27g" Apr 16 18:11:45.336451 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:45.331629 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/62b8f263-e78f-43da-81be-1590c9873bf4-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-router-scheduler-5cf4d7c5cfj27g\" (UID: \"62b8f263-e78f-43da-81be-1590c9873bf4\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5cf4d7c5cfj27g" Apr 16 18:11:45.336451 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:45.331683 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/62b8f263-e78f-43da-81be-1590c9873bf4-tokenizer-uds\") pod \"router-with-refs-pd-test-kserve-router-scheduler-5cf4d7c5cfj27g\" (UID: \"62b8f263-e78f-43da-81be-1590c9873bf4\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5cf4d7c5cfj27g" Apr 16 18:11:45.336451 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:45.331749 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/62b8f263-e78f-43da-81be-1590c9873bf4-tokenizer-tmp\") pod \"router-with-refs-pd-test-kserve-router-scheduler-5cf4d7c5cfj27g\" (UID: 
\"62b8f263-e78f-43da-81be-1590c9873bf4\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5cf4d7c5cfj27g" Apr 16 18:11:45.433075 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:45.433028 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/62b8f263-e78f-43da-81be-1590c9873bf4-tls-certs\") pod \"router-with-refs-pd-test-kserve-router-scheduler-5cf4d7c5cfj27g\" (UID: \"62b8f263-e78f-43da-81be-1590c9873bf4\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5cf4d7c5cfj27g" Apr 16 18:11:45.433075 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:45.433076 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/62b8f263-e78f-43da-81be-1590c9873bf4-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-router-scheduler-5cf4d7c5cfj27g\" (UID: \"62b8f263-e78f-43da-81be-1590c9873bf4\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5cf4d7c5cfj27g" Apr 16 18:11:45.433306 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:45.433114 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/62b8f263-e78f-43da-81be-1590c9873bf4-tokenizer-uds\") pod \"router-with-refs-pd-test-kserve-router-scheduler-5cf4d7c5cfj27g\" (UID: \"62b8f263-e78f-43da-81be-1590c9873bf4\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5cf4d7c5cfj27g" Apr 16 18:11:45.433306 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:45.433156 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/62b8f263-e78f-43da-81be-1590c9873bf4-tokenizer-tmp\") pod \"router-with-refs-pd-test-kserve-router-scheduler-5cf4d7c5cfj27g\" (UID: \"62b8f263-e78f-43da-81be-1590c9873bf4\") " 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5cf4d7c5cfj27g" Apr 16 18:11:45.433306 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:45.433227 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/62b8f263-e78f-43da-81be-1590c9873bf4-tokenizer-cache\") pod \"router-with-refs-pd-test-kserve-router-scheduler-5cf4d7c5cfj27g\" (UID: \"62b8f263-e78f-43da-81be-1590c9873bf4\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5cf4d7c5cfj27g" Apr 16 18:11:45.433306 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:45.433258 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jg4x8\" (UniqueName: \"kubernetes.io/projected/62b8f263-e78f-43da-81be-1590c9873bf4-kube-api-access-jg4x8\") pod \"router-with-refs-pd-test-kserve-router-scheduler-5cf4d7c5cfj27g\" (UID: \"62b8f263-e78f-43da-81be-1590c9873bf4\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5cf4d7c5cfj27g" Apr 16 18:11:45.433618 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:45.433589 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/62b8f263-e78f-43da-81be-1590c9873bf4-tokenizer-uds\") pod \"router-with-refs-pd-test-kserve-router-scheduler-5cf4d7c5cfj27g\" (UID: \"62b8f263-e78f-43da-81be-1590c9873bf4\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5cf4d7c5cfj27g" Apr 16 18:11:45.433734 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:45.433597 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/62b8f263-e78f-43da-81be-1590c9873bf4-tokenizer-tmp\") pod \"router-with-refs-pd-test-kserve-router-scheduler-5cf4d7c5cfj27g\" (UID: \"62b8f263-e78f-43da-81be-1590c9873bf4\") " 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5cf4d7c5cfj27g" Apr 16 18:11:45.433734 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:45.433694 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/62b8f263-e78f-43da-81be-1590c9873bf4-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-router-scheduler-5cf4d7c5cfj27g\" (UID: \"62b8f263-e78f-43da-81be-1590c9873bf4\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5cf4d7c5cfj27g" Apr 16 18:11:45.433734 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:45.433722 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/62b8f263-e78f-43da-81be-1590c9873bf4-tokenizer-cache\") pod \"router-with-refs-pd-test-kserve-router-scheduler-5cf4d7c5cfj27g\" (UID: \"62b8f263-e78f-43da-81be-1590c9873bf4\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5cf4d7c5cfj27g" Apr 16 18:11:45.435706 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:45.435683 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/62b8f263-e78f-43da-81be-1590c9873bf4-tls-certs\") pod \"router-with-refs-pd-test-kserve-router-scheduler-5cf4d7c5cfj27g\" (UID: \"62b8f263-e78f-43da-81be-1590c9873bf4\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5cf4d7c5cfj27g" Apr 16 18:11:45.444361 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:45.444336 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jg4x8\" (UniqueName: \"kubernetes.io/projected/62b8f263-e78f-43da-81be-1590c9873bf4-kube-api-access-jg4x8\") pod \"router-with-refs-pd-test-kserve-router-scheduler-5cf4d7c5cfj27g\" (UID: \"62b8f263-e78f-43da-81be-1590c9873bf4\") " 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5cf4d7c5cfj27g" Apr 16 18:11:45.554900 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:45.554813 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5cf4d7c5cfj27g" Apr 16 18:11:45.729093 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:45.729065 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5cf4d7c5cfj27g"] Apr 16 18:11:45.730704 ip-10-0-141-32 kubenswrapper[2576]: W0416 18:11:45.730675 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod62b8f263_e78f_43da_81be_1590c9873bf4.slice/crio-b156ce732d41c9c0114fa8cf79a2151a4857d568701dcbf43cae083697178f7b WatchSource:0}: Error finding container b156ce732d41c9c0114fa8cf79a2151a4857d568701dcbf43cae083697178f7b: Status 404 returned error can't find the container with id b156ce732d41c9c0114fa8cf79a2151a4857d568701dcbf43cae083697178f7b Apr 16 18:11:46.699785 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:46.699741 2576 generic.go:358] "Generic (PLEG): container finished" podID="62b8f263-e78f-43da-81be-1590c9873bf4" containerID="a006b25f780663336f65d3462fbd84e2ba4465849d145c295b21c5f8676f988b" exitCode=0 Apr 16 18:11:46.700202 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:46.699827 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5cf4d7c5cfj27g" event={"ID":"62b8f263-e78f-43da-81be-1590c9873bf4","Type":"ContainerDied","Data":"a006b25f780663336f65d3462fbd84e2ba4465849d145c295b21c5f8676f988b"} Apr 16 18:11:46.700202 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:46.699865 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5cf4d7c5cfj27g" 
event={"ID":"62b8f263-e78f-43da-81be-1590c9873bf4","Type":"ContainerStarted","Data":"b156ce732d41c9c0114fa8cf79a2151a4857d568701dcbf43cae083697178f7b"} Apr 16 18:11:47.705295 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:47.705258 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5cf4d7c5cfj27g" event={"ID":"62b8f263-e78f-43da-81be-1590c9873bf4","Type":"ContainerStarted","Data":"82e9f325f9d70fc45bfca616039867370d0785e05b0eef7e0d0ee26a0d18536c"} Apr 16 18:11:47.705715 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:47.705311 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5cf4d7c5cfj27g" event={"ID":"62b8f263-e78f-43da-81be-1590c9873bf4","Type":"ContainerStarted","Data":"28d4e699ccc16755b1a3fd11aac070bd6729abc1c088a2bf7019cf83e726b074"} Apr 16 18:11:47.705715 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:47.705366 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5cf4d7c5cfj27g" Apr 16 18:11:47.739088 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:47.739035 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5cf4d7c5cfj27g" podStartSLOduration=2.739017565 podStartE2EDuration="2.739017565s" podCreationTimestamp="2026-04-16 18:11:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:11:47.738757885 +0000 UTC m=+1905.755225431" watchObservedRunningTime="2026-04-16 18:11:47.739017565 +0000 UTC m=+1905.755485098" Apr 16 18:11:55.555669 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:55.555620 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5cf4d7c5cfj27g" Apr 16 18:11:55.555669 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:55.555682 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5cf4d7c5cfj27g" Apr 16 18:11:55.558392 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:55.558367 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5cf4d7c5cfj27g" Apr 16 18:11:55.738710 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:11:55.738681 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5cf4d7c5cfj27g" Apr 16 18:12:16.743163 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:12:16.743131 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5cf4d7c5cfj27g" Apr 16 18:13:37.160822 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:13:37.160786 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5cf4d7c5cfj27g"] Apr 16 18:13:37.161293 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:13:37.161099 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5cf4d7c5cfj27g" podUID="62b8f263-e78f-43da-81be-1590c9873bf4" containerName="main" containerID="cri-o://28d4e699ccc16755b1a3fd11aac070bd6729abc1c088a2bf7019cf83e726b074" gracePeriod=30 Apr 16 18:13:37.161293 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:13:37.161145 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5cf4d7c5cfj27g" podUID="62b8f263-e78f-43da-81be-1590c9873bf4" 
containerName="tokenizer" containerID="cri-o://82e9f325f9d70fc45bfca616039867370d0785e05b0eef7e0d0ee26a0d18536c" gracePeriod=30 Apr 16 18:13:38.137820 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:13:38.137780 2576 generic.go:358] "Generic (PLEG): container finished" podID="62b8f263-e78f-43da-81be-1590c9873bf4" containerID="28d4e699ccc16755b1a3fd11aac070bd6729abc1c088a2bf7019cf83e726b074" exitCode=0 Apr 16 18:13:38.138033 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:13:38.137852 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5cf4d7c5cfj27g" event={"ID":"62b8f263-e78f-43da-81be-1590c9873bf4","Type":"ContainerDied","Data":"28d4e699ccc16755b1a3fd11aac070bd6729abc1c088a2bf7019cf83e726b074"} Apr 16 18:13:38.514827 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:13:38.514798 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5cf4d7c5cfj27g" Apr 16 18:13:38.643097 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:13:38.643063 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/62b8f263-e78f-43da-81be-1590c9873bf4-tls-certs\") pod \"62b8f263-e78f-43da-81be-1590c9873bf4\" (UID: \"62b8f263-e78f-43da-81be-1590c9873bf4\") " Apr 16 18:13:38.643264 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:13:38.643129 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/62b8f263-e78f-43da-81be-1590c9873bf4-tokenizer-cache\") pod \"62b8f263-e78f-43da-81be-1590c9873bf4\" (UID: \"62b8f263-e78f-43da-81be-1590c9873bf4\") " Apr 16 18:13:38.643264 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:13:38.643234 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jg4x8\" (UniqueName: 
\"kubernetes.io/projected/62b8f263-e78f-43da-81be-1590c9873bf4-kube-api-access-jg4x8\") pod \"62b8f263-e78f-43da-81be-1590c9873bf4\" (UID: \"62b8f263-e78f-43da-81be-1590c9873bf4\") " Apr 16 18:13:38.643372 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:13:38.643298 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/62b8f263-e78f-43da-81be-1590c9873bf4-kserve-provision-location\") pod \"62b8f263-e78f-43da-81be-1590c9873bf4\" (UID: \"62b8f263-e78f-43da-81be-1590c9873bf4\") " Apr 16 18:13:38.643372 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:13:38.643332 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/62b8f263-e78f-43da-81be-1590c9873bf4-tokenizer-tmp\") pod \"62b8f263-e78f-43da-81be-1590c9873bf4\" (UID: \"62b8f263-e78f-43da-81be-1590c9873bf4\") " Apr 16 18:13:38.643372 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:13:38.643348 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62b8f263-e78f-43da-81be-1590c9873bf4-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "62b8f263-e78f-43da-81be-1590c9873bf4" (UID: "62b8f263-e78f-43da-81be-1590c9873bf4"). InnerVolumeSpecName "tokenizer-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:13:38.643372 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:13:38.643364 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/62b8f263-e78f-43da-81be-1590c9873bf4-tokenizer-uds\") pod \"62b8f263-e78f-43da-81be-1590c9873bf4\" (UID: \"62b8f263-e78f-43da-81be-1590c9873bf4\") " Apr 16 18:13:38.643662 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:13:38.643642 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/62b8f263-e78f-43da-81be-1590c9873bf4-tokenizer-cache\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\"" Apr 16 18:13:38.643742 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:13:38.643687 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62b8f263-e78f-43da-81be-1590c9873bf4-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "62b8f263-e78f-43da-81be-1590c9873bf4" (UID: "62b8f263-e78f-43da-81be-1590c9873bf4"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:13:38.643742 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:13:38.643704 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62b8f263-e78f-43da-81be-1590c9873bf4-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "62b8f263-e78f-43da-81be-1590c9873bf4" (UID: "62b8f263-e78f-43da-81be-1590c9873bf4"). InnerVolumeSpecName "tokenizer-tmp". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:13:38.644089 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:13:38.644065 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62b8f263-e78f-43da-81be-1590c9873bf4-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "62b8f263-e78f-43da-81be-1590c9873bf4" (UID: "62b8f263-e78f-43da-81be-1590c9873bf4"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:13:38.645278 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:13:38.645256 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62b8f263-e78f-43da-81be-1590c9873bf4-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "62b8f263-e78f-43da-81be-1590c9873bf4" (UID: "62b8f263-e78f-43da-81be-1590c9873bf4"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:13:38.645380 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:13:38.645255 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62b8f263-e78f-43da-81be-1590c9873bf4-kube-api-access-jg4x8" (OuterVolumeSpecName: "kube-api-access-jg4x8") pod "62b8f263-e78f-43da-81be-1590c9873bf4" (UID: "62b8f263-e78f-43da-81be-1590c9873bf4"). InnerVolumeSpecName "kube-api-access-jg4x8". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:13:38.744887 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:13:38.744798 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/62b8f263-e78f-43da-81be-1590c9873bf4-kserve-provision-location\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\"" Apr 16 18:13:38.744887 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:13:38.744844 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/62b8f263-e78f-43da-81be-1590c9873bf4-tokenizer-tmp\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\"" Apr 16 18:13:38.744887 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:13:38.744861 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/62b8f263-e78f-43da-81be-1590c9873bf4-tokenizer-uds\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\"" Apr 16 18:13:38.744887 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:13:38.744874 2576 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/62b8f263-e78f-43da-81be-1590c9873bf4-tls-certs\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\"" Apr 16 18:13:38.744887 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:13:38.744888 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jg4x8\" (UniqueName: \"kubernetes.io/projected/62b8f263-e78f-43da-81be-1590c9873bf4-kube-api-access-jg4x8\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\"" Apr 16 18:13:39.143354 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:13:39.143258 2576 generic.go:358] "Generic (PLEG): container finished" podID="62b8f263-e78f-43da-81be-1590c9873bf4" containerID="82e9f325f9d70fc45bfca616039867370d0785e05b0eef7e0d0ee26a0d18536c" exitCode=0 Apr 16 18:13:39.143354 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:13:39.143320 2576 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5cf4d7c5cfj27g" event={"ID":"62b8f263-e78f-43da-81be-1590c9873bf4","Type":"ContainerDied","Data":"82e9f325f9d70fc45bfca616039867370d0785e05b0eef7e0d0ee26a0d18536c"} Apr 16 18:13:39.143354 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:13:39.143340 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5cf4d7c5cfj27g" Apr 16 18:13:39.143354 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:13:39.143354 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5cf4d7c5cfj27g" event={"ID":"62b8f263-e78f-43da-81be-1590c9873bf4","Type":"ContainerDied","Data":"b156ce732d41c9c0114fa8cf79a2151a4857d568701dcbf43cae083697178f7b"} Apr 16 18:13:39.143670 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:13:39.143374 2576 scope.go:117] "RemoveContainer" containerID="82e9f325f9d70fc45bfca616039867370d0785e05b0eef7e0d0ee26a0d18536c" Apr 16 18:13:39.152596 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:13:39.152577 2576 scope.go:117] "RemoveContainer" containerID="28d4e699ccc16755b1a3fd11aac070bd6729abc1c088a2bf7019cf83e726b074" Apr 16 18:13:39.162399 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:13:39.162377 2576 scope.go:117] "RemoveContainer" containerID="a006b25f780663336f65d3462fbd84e2ba4465849d145c295b21c5f8676f988b" Apr 16 18:13:39.170458 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:13:39.170437 2576 scope.go:117] "RemoveContainer" containerID="82e9f325f9d70fc45bfca616039867370d0785e05b0eef7e0d0ee26a0d18536c" Apr 16 18:13:39.170743 ip-10-0-141-32 kubenswrapper[2576]: E0416 18:13:39.170726 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82e9f325f9d70fc45bfca616039867370d0785e05b0eef7e0d0ee26a0d18536c\": container with ID 
starting with 82e9f325f9d70fc45bfca616039867370d0785e05b0eef7e0d0ee26a0d18536c not found: ID does not exist" containerID="82e9f325f9d70fc45bfca616039867370d0785e05b0eef7e0d0ee26a0d18536c" Apr 16 18:13:39.170788 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:13:39.170752 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82e9f325f9d70fc45bfca616039867370d0785e05b0eef7e0d0ee26a0d18536c"} err="failed to get container status \"82e9f325f9d70fc45bfca616039867370d0785e05b0eef7e0d0ee26a0d18536c\": rpc error: code = NotFound desc = could not find container \"82e9f325f9d70fc45bfca616039867370d0785e05b0eef7e0d0ee26a0d18536c\": container with ID starting with 82e9f325f9d70fc45bfca616039867370d0785e05b0eef7e0d0ee26a0d18536c not found: ID does not exist" Apr 16 18:13:39.170788 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:13:39.170770 2576 scope.go:117] "RemoveContainer" containerID="28d4e699ccc16755b1a3fd11aac070bd6729abc1c088a2bf7019cf83e726b074" Apr 16 18:13:39.171033 ip-10-0-141-32 kubenswrapper[2576]: E0416 18:13:39.171015 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28d4e699ccc16755b1a3fd11aac070bd6729abc1c088a2bf7019cf83e726b074\": container with ID starting with 28d4e699ccc16755b1a3fd11aac070bd6729abc1c088a2bf7019cf83e726b074 not found: ID does not exist" containerID="28d4e699ccc16755b1a3fd11aac070bd6729abc1c088a2bf7019cf83e726b074" Apr 16 18:13:39.171088 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:13:39.171040 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28d4e699ccc16755b1a3fd11aac070bd6729abc1c088a2bf7019cf83e726b074"} err="failed to get container status \"28d4e699ccc16755b1a3fd11aac070bd6729abc1c088a2bf7019cf83e726b074\": rpc error: code = NotFound desc = could not find container \"28d4e699ccc16755b1a3fd11aac070bd6729abc1c088a2bf7019cf83e726b074\": container with ID starting with 
28d4e699ccc16755b1a3fd11aac070bd6729abc1c088a2bf7019cf83e726b074 not found: ID does not exist" Apr 16 18:13:39.171088 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:13:39.171056 2576 scope.go:117] "RemoveContainer" containerID="a006b25f780663336f65d3462fbd84e2ba4465849d145c295b21c5f8676f988b" Apr 16 18:13:39.171263 ip-10-0-141-32 kubenswrapper[2576]: E0416 18:13:39.171248 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a006b25f780663336f65d3462fbd84e2ba4465849d145c295b21c5f8676f988b\": container with ID starting with a006b25f780663336f65d3462fbd84e2ba4465849d145c295b21c5f8676f988b not found: ID does not exist" containerID="a006b25f780663336f65d3462fbd84e2ba4465849d145c295b21c5f8676f988b" Apr 16 18:13:39.171306 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:13:39.171269 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a006b25f780663336f65d3462fbd84e2ba4465849d145c295b21c5f8676f988b"} err="failed to get container status \"a006b25f780663336f65d3462fbd84e2ba4465849d145c295b21c5f8676f988b\": rpc error: code = NotFound desc = could not find container \"a006b25f780663336f65d3462fbd84e2ba4465849d145c295b21c5f8676f988b\": container with ID starting with a006b25f780663336f65d3462fbd84e2ba4465849d145c295b21c5f8676f988b not found: ID does not exist" Apr 16 18:13:39.194300 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:13:39.194267 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5cf4d7c5cfj27g"] Apr 16 18:13:39.196432 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:13:39.196410 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5cf4d7c5cfj27g"] Apr 16 18:13:39.312738 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:13:39.312704 2576 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-must-gather-w62rw/must-gather-xmz65"] Apr 16 18:13:39.313072 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:13:39.313058 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="62b8f263-e78f-43da-81be-1590c9873bf4" containerName="storage-initializer" Apr 16 18:13:39.313133 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:13:39.313073 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="62b8f263-e78f-43da-81be-1590c9873bf4" containerName="storage-initializer" Apr 16 18:13:39.313133 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:13:39.313081 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="62b8f263-e78f-43da-81be-1590c9873bf4" containerName="tokenizer" Apr 16 18:13:39.313133 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:13:39.313086 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="62b8f263-e78f-43da-81be-1590c9873bf4" containerName="tokenizer" Apr 16 18:13:39.313133 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:13:39.313100 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="62b8f263-e78f-43da-81be-1590c9873bf4" containerName="main" Apr 16 18:13:39.313133 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:13:39.313106 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="62b8f263-e78f-43da-81be-1590c9873bf4" containerName="main" Apr 16 18:13:39.313294 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:13:39.313165 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="62b8f263-e78f-43da-81be-1590c9873bf4" containerName="main" Apr 16 18:13:39.313294 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:13:39.313175 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="62b8f263-e78f-43da-81be-1590c9873bf4" containerName="tokenizer" Apr 16 18:13:39.317837 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:13:39.317819 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-w62rw/must-gather-xmz65" Apr 16 18:13:39.321245 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:13:39.321227 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-w62rw\"/\"kube-root-ca.crt\"" Apr 16 18:13:39.321505 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:13:39.321474 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-w62rw\"/\"default-dockercfg-v7npx\"" Apr 16 18:13:39.321505 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:13:39.321483 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-w62rw\"/\"openshift-service-ca.crt\"" Apr 16 18:13:39.328404 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:13:39.328382 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-w62rw/must-gather-xmz65"] Apr 16 18:13:39.452066 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:13:39.452028 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e400b269-cfdb-4931-88d7-4957fec5642b-must-gather-output\") pod \"must-gather-xmz65\" (UID: \"e400b269-cfdb-4931-88d7-4957fec5642b\") " pod="openshift-must-gather-w62rw/must-gather-xmz65" Apr 16 18:13:39.452236 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:13:39.452078 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l88tp\" (UniqueName: \"kubernetes.io/projected/e400b269-cfdb-4931-88d7-4957fec5642b-kube-api-access-l88tp\") pod \"must-gather-xmz65\" (UID: \"e400b269-cfdb-4931-88d7-4957fec5642b\") " pod="openshift-must-gather-w62rw/must-gather-xmz65" Apr 16 18:13:39.553576 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:13:39.553524 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: 
\"kubernetes.io/empty-dir/e400b269-cfdb-4931-88d7-4957fec5642b-must-gather-output\") pod \"must-gather-xmz65\" (UID: \"e400b269-cfdb-4931-88d7-4957fec5642b\") " pod="openshift-must-gather-w62rw/must-gather-xmz65"
Apr 16 18:13:39.554021 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:13:39.553608 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l88tp\" (UniqueName: \"kubernetes.io/projected/e400b269-cfdb-4931-88d7-4957fec5642b-kube-api-access-l88tp\") pod \"must-gather-xmz65\" (UID: \"e400b269-cfdb-4931-88d7-4957fec5642b\") " pod="openshift-must-gather-w62rw/must-gather-xmz65"
Apr 16 18:13:39.554021 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:13:39.553855 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e400b269-cfdb-4931-88d7-4957fec5642b-must-gather-output\") pod \"must-gather-xmz65\" (UID: \"e400b269-cfdb-4931-88d7-4957fec5642b\") " pod="openshift-must-gather-w62rw/must-gather-xmz65"
Apr 16 18:13:39.573870 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:13:39.573842 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l88tp\" (UniqueName: \"kubernetes.io/projected/e400b269-cfdb-4931-88d7-4957fec5642b-kube-api-access-l88tp\") pod \"must-gather-xmz65\" (UID: \"e400b269-cfdb-4931-88d7-4957fec5642b\") " pod="openshift-must-gather-w62rw/must-gather-xmz65"
Apr 16 18:13:39.640297 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:13:39.640257 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-w62rw/must-gather-xmz65"
Apr 16 18:13:39.778334 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:13:39.778313 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-w62rw/must-gather-xmz65"]
Apr 16 18:13:39.780563 ip-10-0-141-32 kubenswrapper[2576]: W0416 18:13:39.780529 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode400b269_cfdb_4931_88d7_4957fec5642b.slice/crio-00dc35a12e842b6da8a92eadf00b4456a8612c38f67f5e86eb01a628d9dafac7 WatchSource:0}: Error finding container 00dc35a12e842b6da8a92eadf00b4456a8612c38f67f5e86eb01a628d9dafac7: Status 404 returned error can't find the container with id 00dc35a12e842b6da8a92eadf00b4456a8612c38f67f5e86eb01a628d9dafac7
Apr 16 18:13:40.149077 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:13:40.149038 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-w62rw/must-gather-xmz65" event={"ID":"e400b269-cfdb-4931-88d7-4957fec5642b","Type":"ContainerStarted","Data":"00dc35a12e842b6da8a92eadf00b4456a8612c38f67f5e86eb01a628d9dafac7"}
Apr 16 18:13:40.630007 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:13:40.629424 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62b8f263-e78f-43da-81be-1590c9873bf4" path="/var/lib/kubelet/pods/62b8f263-e78f-43da-81be-1590c9873bf4/volumes"
Apr 16 18:13:44.168895 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:13:44.168866 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-w62rw/must-gather-xmz65" event={"ID":"e400b269-cfdb-4931-88d7-4957fec5642b","Type":"ContainerStarted","Data":"7358d6f9d9de4c9d201ca99825cd7ed30de40320f4eb1475050aeb47b8d8fdb3"}
Apr 16 18:13:45.174452 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:13:45.174401 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-w62rw/must-gather-xmz65" event={"ID":"e400b269-cfdb-4931-88d7-4957fec5642b","Type":"ContainerStarted","Data":"2dd6ccceb1efb1aae40ad5bb70f113db4e303667ff145b4eb079e7ba3fdb11b2"}
Apr 16 18:13:45.195947 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:13:45.195878 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-w62rw/must-gather-xmz65" podStartSLOduration=1.938541165 podStartE2EDuration="6.195862618s" podCreationTimestamp="2026-04-16 18:13:39 +0000 UTC" firstStartedPulling="2026-04-16 18:13:39.782330049 +0000 UTC m=+2017.798797549" lastFinishedPulling="2026-04-16 18:13:44.039651489 +0000 UTC m=+2022.056119002" observedRunningTime="2026-04-16 18:13:45.193706669 +0000 UTC m=+2023.210174190" watchObservedRunningTime="2026-04-16 18:13:45.195862618 +0000 UTC m=+2023.212330164"
Apr 16 18:13:53.850081 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:13:53.850049 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-98r99_6b214e90-043e-4e8e-8231-cf881f4d0988/istio-proxy/0.log"
Apr 16 18:13:53.880315 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:13:53.880272 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-jdq4f_cdff9bd6-0a70-4825-8d1e-3488bf21909f/istio-proxy/0.log"
Apr 16 18:13:55.016545 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:13:55.016513 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-98r99_6b214e90-043e-4e8e-8231-cf881f4d0988/istio-proxy/0.log"
Apr 16 18:13:55.039752 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:13:55.039724 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-jdq4f_cdff9bd6-0a70-4825-8d1e-3488bf21909f/istio-proxy/0.log"
Apr 16 18:13:56.152414 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:13:56.152384 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-98r99_6b214e90-043e-4e8e-8231-cf881f4d0988/istio-proxy/0.log"
Apr 16 18:13:56.169773 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:13:56.169737 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-jdq4f_cdff9bd6-0a70-4825-8d1e-3488bf21909f/istio-proxy/0.log"
Apr 16 18:13:57.225297 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:13:57.225263 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-98r99_6b214e90-043e-4e8e-8231-cf881f4d0988/istio-proxy/0.log"
Apr 16 18:13:57.243202 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:13:57.243178 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-jdq4f_cdff9bd6-0a70-4825-8d1e-3488bf21909f/istio-proxy/0.log"
Apr 16 18:13:58.290770 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:13:58.290742 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-98r99_6b214e90-043e-4e8e-8231-cf881f4d0988/istio-proxy/0.log"
Apr 16 18:13:58.309310 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:13:58.309287 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-jdq4f_cdff9bd6-0a70-4825-8d1e-3488bf21909f/istio-proxy/0.log"
Apr 16 18:13:59.436739 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:13:59.436711 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-98r99_6b214e90-043e-4e8e-8231-cf881f4d0988/istio-proxy/0.log"
Apr 16 18:13:59.458073 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:13:59.458045 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-jdq4f_cdff9bd6-0a70-4825-8d1e-3488bf21909f/istio-proxy/0.log"
Apr 16 18:14:00.572552 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:00.572524 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-98r99_6b214e90-043e-4e8e-8231-cf881f4d0988/istio-proxy/0.log"
Apr 16 18:14:00.594344 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:00.594318 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-jdq4f_cdff9bd6-0a70-4825-8d1e-3488bf21909f/istio-proxy/0.log"
Apr 16 18:14:01.701030 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:01.701001 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-98r99_6b214e90-043e-4e8e-8231-cf881f4d0988/istio-proxy/0.log"
Apr 16 18:14:01.728091 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:01.728060 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-jdq4f_cdff9bd6-0a70-4825-8d1e-3488bf21909f/istio-proxy/0.log"
Apr 16 18:14:02.840981 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:02.840940 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-98r99_6b214e90-043e-4e8e-8231-cf881f4d0988/istio-proxy/0.log"
Apr 16 18:14:02.860305 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:02.860276 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-jdq4f_cdff9bd6-0a70-4825-8d1e-3488bf21909f/istio-proxy/0.log"
Apr 16 18:14:04.075126 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:04.075099 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-98r99_6b214e90-043e-4e8e-8231-cf881f4d0988/istio-proxy/0.log"
Apr 16 18:14:04.102120 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:04.102090 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-jdq4f_cdff9bd6-0a70-4825-8d1e-3488bf21909f/istio-proxy/0.log"
Apr 16 18:14:05.256101 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:05.256073 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-98r99_6b214e90-043e-4e8e-8231-cf881f4d0988/istio-proxy/0.log"
Apr 16 18:14:05.278175 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:05.278133 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-jdq4f_cdff9bd6-0a70-4825-8d1e-3488bf21909f/istio-proxy/0.log"
Apr 16 18:14:06.366665 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:06.366634 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-98r99_6b214e90-043e-4e8e-8231-cf881f4d0988/istio-proxy/0.log"
Apr 16 18:14:06.393056 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:06.393023 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-jdq4f_cdff9bd6-0a70-4825-8d1e-3488bf21909f/istio-proxy/0.log"
Apr 16 18:14:07.521116 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:07.521085 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-98r99_6b214e90-043e-4e8e-8231-cf881f4d0988/istio-proxy/0.log"
Apr 16 18:14:07.540690 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:07.540663 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-jdq4f_cdff9bd6-0a70-4825-8d1e-3488bf21909f/istio-proxy/0.log"
Apr 16 18:14:08.717800 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:08.717766 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-98r99_6b214e90-043e-4e8e-8231-cf881f4d0988/istio-proxy/0.log"
Apr 16 18:14:08.738654 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:08.738622 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-jdq4f_cdff9bd6-0a70-4825-8d1e-3488bf21909f/istio-proxy/0.log"
Apr 16 18:14:09.765696 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:09.765664 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-4xmn4_e37d1415-b492-405e-9870-3b7c399f5c0d/discovery/0.log"
Apr 16 18:14:09.806106 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:09.806072 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_openshift-ai-inference-openshift-default-7c5447bb76-4ltxc_bebca98f-660f-43f3-90d4-f1020f766e4f/istio-proxy/0.log"
Apr 16 18:14:09.836337 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:09.836311 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-587cd5bb74-wz624_2c0dcd7e-0c96-40a1-a6fd-2c9e2109db1a/router/0.log"
Apr 16 18:14:10.751961 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:10.751929 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-4xmn4_e37d1415-b492-405e-9870-3b7c399f5c0d/discovery/0.log"
Apr 16 18:14:10.775216 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:10.775191 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_openshift-ai-inference-openshift-default-7c5447bb76-4ltxc_bebca98f-660f-43f3-90d4-f1020f766e4f/istio-proxy/0.log"
Apr 16 18:14:10.812361 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:10.812336 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-587cd5bb74-wz624_2c0dcd7e-0c96-40a1-a6fd-2c9e2109db1a/router/0.log"
Apr 16 18:14:11.761011 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:11.760983 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-6ddf9554fc-bw45t_b9a6220d-cf89-4be1-adf4-f1bd40e9b09e/manager/0.log"
Apr 16 18:14:11.795856 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:11.795833 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-c7fb4c8d5-7j8n5_f28ebcd1-ba12-4f95-ba66-e35818bba52a/manager/0.log"
Apr 16 18:14:13.296515 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:13.296471 2576 generic.go:358] "Generic (PLEG): container finished" podID="e400b269-cfdb-4931-88d7-4957fec5642b" containerID="7358d6f9d9de4c9d201ca99825cd7ed30de40320f4eb1475050aeb47b8d8fdb3" exitCode=0
Apr 16 18:14:13.296956 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:13.296544 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-w62rw/must-gather-xmz65" event={"ID":"e400b269-cfdb-4931-88d7-4957fec5642b","Type":"ContainerDied","Data":"7358d6f9d9de4c9d201ca99825cd7ed30de40320f4eb1475050aeb47b8d8fdb3"}
Apr 16 18:14:13.297008 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:13.296961 2576 scope.go:117] "RemoveContainer" containerID="7358d6f9d9de4c9d201ca99825cd7ed30de40320f4eb1475050aeb47b8d8fdb3"
Apr 16 18:14:13.573373 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:13.573288 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-w62rw_must-gather-xmz65_e400b269-cfdb-4931-88d7-4957fec5642b/gather/0.log"
Apr 16 18:14:17.632666 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:17.632637 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-hxk67_908ddae4-92b4-4772-acb3-f93ed0f019ea/global-pull-secret-syncer/0.log"
Apr 16 18:14:17.785951 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:17.785924 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-dbzll_eb3b87bf-55de-44cd-a182-ab40925c246f/konnectivity-agent/0.log"
Apr 16 18:14:17.894936 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:17.894841 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-141-32.ec2.internal_f34ad151e5f273367866fd58bbc327be/haproxy/0.log"
Apr 16 18:14:19.156689 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:19.156656 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-w62rw/must-gather-xmz65"]
Apr 16 18:14:19.157115 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:19.156871 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-w62rw/must-gather-xmz65" podUID="e400b269-cfdb-4931-88d7-4957fec5642b" containerName="copy" containerID="cri-o://2dd6ccceb1efb1aae40ad5bb70f113db4e303667ff145b4eb079e7ba3fdb11b2" gracePeriod=2
Apr 16 18:14:19.163930 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:19.163894 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-w62rw/must-gather-xmz65"]
Apr 16 18:14:19.322273 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:19.322247 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-w62rw_must-gather-xmz65_e400b269-cfdb-4931-88d7-4957fec5642b/copy/0.log"
Apr 16 18:14:19.322639 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:19.322608 2576 generic.go:358] "Generic (PLEG): container finished" podID="e400b269-cfdb-4931-88d7-4957fec5642b" containerID="2dd6ccceb1efb1aae40ad5bb70f113db4e303667ff145b4eb079e7ba3fdb11b2" exitCode=143
Apr 16 18:14:19.391993 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:19.391967 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-w62rw_must-gather-xmz65_e400b269-cfdb-4931-88d7-4957fec5642b/copy/0.log"
Apr 16 18:14:19.392346 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:19.392330 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-w62rw/must-gather-xmz65"
Apr 16 18:14:19.394751 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:19.394727 2576 status_manager.go:895] "Failed to get status for pod" podUID="e400b269-cfdb-4931-88d7-4957fec5642b" pod="openshift-must-gather-w62rw/must-gather-xmz65" err="pods \"must-gather-xmz65\" is forbidden: User \"system:node:ip-10-0-141-32.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-w62rw\": no relationship found between node 'ip-10-0-141-32.ec2.internal' and this object"
Apr 16 18:14:19.516946 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:19.516829 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l88tp\" (UniqueName: \"kubernetes.io/projected/e400b269-cfdb-4931-88d7-4957fec5642b-kube-api-access-l88tp\") pod \"e400b269-cfdb-4931-88d7-4957fec5642b\" (UID: \"e400b269-cfdb-4931-88d7-4957fec5642b\") "
Apr 16 18:14:19.517106 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:19.517051 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e400b269-cfdb-4931-88d7-4957fec5642b-must-gather-output\") pod \"e400b269-cfdb-4931-88d7-4957fec5642b\" (UID: \"e400b269-cfdb-4931-88d7-4957fec5642b\") "
Apr 16 18:14:19.519151 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:19.519116 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e400b269-cfdb-4931-88d7-4957fec5642b-kube-api-access-l88tp" (OuterVolumeSpecName: "kube-api-access-l88tp") pod "e400b269-cfdb-4931-88d7-4957fec5642b" (UID: "e400b269-cfdb-4931-88d7-4957fec5642b"). InnerVolumeSpecName "kube-api-access-l88tp". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 18:14:19.522880 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:19.522854 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e400b269-cfdb-4931-88d7-4957fec5642b-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "e400b269-cfdb-4931-88d7-4957fec5642b" (UID: "e400b269-cfdb-4931-88d7-4957fec5642b"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:14:19.618424 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:19.618382 2576 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e400b269-cfdb-4931-88d7-4957fec5642b-must-gather-output\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\""
Apr 16 18:14:19.618424 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:19.618417 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-l88tp\" (UniqueName: \"kubernetes.io/projected/e400b269-cfdb-4931-88d7-4957fec5642b-kube-api-access-l88tp\") on node \"ip-10-0-141-32.ec2.internal\" DevicePath \"\""
Apr 16 18:14:20.328300 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:20.328271 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-w62rw_must-gather-xmz65_e400b269-cfdb-4931-88d7-4957fec5642b/copy/0.log"
Apr 16 18:14:20.328698 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:20.328670 2576 scope.go:117] "RemoveContainer" containerID="2dd6ccceb1efb1aae40ad5bb70f113db4e303667ff145b4eb079e7ba3fdb11b2"
Apr 16 18:14:20.328746 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:20.328672 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-w62rw/must-gather-xmz65"
Apr 16 18:14:20.331751 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:20.331718 2576 status_manager.go:895] "Failed to get status for pod" podUID="e400b269-cfdb-4931-88d7-4957fec5642b" pod="openshift-must-gather-w62rw/must-gather-xmz65" err="pods \"must-gather-xmz65\" is forbidden: User \"system:node:ip-10-0-141-32.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-w62rw\": no relationship found between node 'ip-10-0-141-32.ec2.internal' and this object"
Apr 16 18:14:20.337799 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:20.337779 2576 scope.go:117] "RemoveContainer" containerID="7358d6f9d9de4c9d201ca99825cd7ed30de40320f4eb1475050aeb47b8d8fdb3"
Apr 16 18:14:20.341651 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:20.341620 2576 status_manager.go:895] "Failed to get status for pod" podUID="e400b269-cfdb-4931-88d7-4957fec5642b" pod="openshift-must-gather-w62rw/must-gather-xmz65" err="pods \"must-gather-xmz65\" is forbidden: User \"system:node:ip-10-0-141-32.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-w62rw\": no relationship found between node 'ip-10-0-141-32.ec2.internal' and this object"
Apr 16 18:14:20.627963 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:20.627867 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e400b269-cfdb-4931-88d7-4957fec5642b" path="/var/lib/kubelet/pods/e400b269-cfdb-4931-88d7-4957fec5642b/volumes"
Apr 16 18:14:22.075619 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:22.075588 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-6ddf9554fc-bw45t_b9a6220d-cf89-4be1-adf4-f1bd40e9b09e/manager/0.log"
Apr 16 18:14:22.150881 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:22.150844 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-c7fb4c8d5-7j8n5_f28ebcd1-ba12-4f95-ba66-e35818bba52a/manager/0.log"
Apr 16 18:14:23.440889 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:23.440854 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-6667474d89-gvbhc_1b5a2170-b541-4c6d-918a-c836b3286e61/cluster-monitoring-operator/1.log"
Apr 16 18:14:23.513634 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:23.513603 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-6667474d89-gvbhc_1b5a2170-b541-4c6d-918a-c836b3286e61/cluster-monitoring-operator/0.log"
Apr 16 18:14:23.814134 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:23.814056 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-j556q_b08bc540-ee52-405a-91b1-6b666ac80a17/node-exporter/0.log"
Apr 16 18:14:23.838894 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:23.838866 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-j556q_b08bc540-ee52-405a-91b1-6b666ac80a17/kube-rbac-proxy/0.log"
Apr 16 18:14:23.865600 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:23.865571 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-j556q_b08bc540-ee52-405a-91b1-6b666ac80a17/init-textfile/0.log"
Apr 16 18:14:24.429440 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:24.429411 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-9cb97cd87-plrpx_9f2ec8b8-30a4-4029-bcbf-b65983bf98df/prometheus-operator-admission-webhook/0.log"
Apr 16 18:14:26.242411 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:26.242378 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-kw59r/perf-node-gather-daemonset-rgrkk"]
Apr 16 18:14:26.242900 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:26.242878 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e400b269-cfdb-4931-88d7-4957fec5642b" containerName="copy"
Apr 16 18:14:26.243014 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:26.242928 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="e400b269-cfdb-4931-88d7-4957fec5642b" containerName="copy"
Apr 16 18:14:26.243014 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:26.242961 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e400b269-cfdb-4931-88d7-4957fec5642b" containerName="gather"
Apr 16 18:14:26.243014 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:26.242970 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="e400b269-cfdb-4931-88d7-4957fec5642b" containerName="gather"
Apr 16 18:14:26.243166 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:26.243074 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="e400b269-cfdb-4931-88d7-4957fec5642b" containerName="gather"
Apr 16 18:14:26.243166 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:26.243086 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="e400b269-cfdb-4931-88d7-4957fec5642b" containerName="copy"
Apr 16 18:14:26.249413 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:26.249387 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kw59r/perf-node-gather-daemonset-rgrkk"
Apr 16 18:14:26.252959 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:26.252933 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-kw59r\"/\"kube-root-ca.crt\""
Apr 16 18:14:26.253082 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:26.253032 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-kw59r\"/\"default-dockercfg-tg6h4\""
Apr 16 18:14:26.254049 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:26.254031 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-kw59r\"/\"openshift-service-ca.crt\""
Apr 16 18:14:26.268964 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:26.268929 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kw59r/perf-node-gather-daemonset-rgrkk"]
Apr 16 18:14:26.383778 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:26.383744 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5a42c7a9-8577-4eea-81c1-18b2dd433150-sys\") pod \"perf-node-gather-daemonset-rgrkk\" (UID: \"5a42c7a9-8577-4eea-81c1-18b2dd433150\") " pod="openshift-must-gather-kw59r/perf-node-gather-daemonset-rgrkk"
Apr 16 18:14:26.383778 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:26.383780 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/5a42c7a9-8577-4eea-81c1-18b2dd433150-proc\") pod \"perf-node-gather-daemonset-rgrkk\" (UID: \"5a42c7a9-8577-4eea-81c1-18b2dd433150\") " pod="openshift-must-gather-kw59r/perf-node-gather-daemonset-rgrkk"
Apr 16 18:14:26.384063 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:26.383815 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5a42c7a9-8577-4eea-81c1-18b2dd433150-lib-modules\") pod \"perf-node-gather-daemonset-rgrkk\" (UID: \"5a42c7a9-8577-4eea-81c1-18b2dd433150\") " pod="openshift-must-gather-kw59r/perf-node-gather-daemonset-rgrkk"
Apr 16 18:14:26.384063 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:26.383970 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/5a42c7a9-8577-4eea-81c1-18b2dd433150-podres\") pod \"perf-node-gather-daemonset-rgrkk\" (UID: \"5a42c7a9-8577-4eea-81c1-18b2dd433150\") " pod="openshift-must-gather-kw59r/perf-node-gather-daemonset-rgrkk"
Apr 16 18:14:26.384063 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:26.384022 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldwkv\" (UniqueName: \"kubernetes.io/projected/5a42c7a9-8577-4eea-81c1-18b2dd433150-kube-api-access-ldwkv\") pod \"perf-node-gather-daemonset-rgrkk\" (UID: \"5a42c7a9-8577-4eea-81c1-18b2dd433150\") " pod="openshift-must-gather-kw59r/perf-node-gather-daemonset-rgrkk"
Apr 16 18:14:26.484458 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:26.484412 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ldwkv\" (UniqueName: \"kubernetes.io/projected/5a42c7a9-8577-4eea-81c1-18b2dd433150-kube-api-access-ldwkv\") pod \"perf-node-gather-daemonset-rgrkk\" (UID: \"5a42c7a9-8577-4eea-81c1-18b2dd433150\") " pod="openshift-must-gather-kw59r/perf-node-gather-daemonset-rgrkk"
Apr 16 18:14:26.484648 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:26.484472 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5a42c7a9-8577-4eea-81c1-18b2dd433150-sys\") pod \"perf-node-gather-daemonset-rgrkk\" (UID: \"5a42c7a9-8577-4eea-81c1-18b2dd433150\") " pod="openshift-must-gather-kw59r/perf-node-gather-daemonset-rgrkk"
Apr 16 18:14:26.484648 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:26.484498 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/5a42c7a9-8577-4eea-81c1-18b2dd433150-proc\") pod \"perf-node-gather-daemonset-rgrkk\" (UID: \"5a42c7a9-8577-4eea-81c1-18b2dd433150\") " pod="openshift-must-gather-kw59r/perf-node-gather-daemonset-rgrkk"
Apr 16 18:14:26.484648 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:26.484551 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5a42c7a9-8577-4eea-81c1-18b2dd433150-lib-modules\") pod \"perf-node-gather-daemonset-rgrkk\" (UID: \"5a42c7a9-8577-4eea-81c1-18b2dd433150\") " pod="openshift-must-gather-kw59r/perf-node-gather-daemonset-rgrkk"
Apr 16 18:14:26.484648 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:26.484628 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/5a42c7a9-8577-4eea-81c1-18b2dd433150-proc\") pod \"perf-node-gather-daemonset-rgrkk\" (UID: \"5a42c7a9-8577-4eea-81c1-18b2dd433150\") " pod="openshift-must-gather-kw59r/perf-node-gather-daemonset-rgrkk"
Apr 16 18:14:26.484874 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:26.484623 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5a42c7a9-8577-4eea-81c1-18b2dd433150-sys\") pod \"perf-node-gather-daemonset-rgrkk\" (UID: \"5a42c7a9-8577-4eea-81c1-18b2dd433150\") " pod="openshift-must-gather-kw59r/perf-node-gather-daemonset-rgrkk"
Apr 16 18:14:26.484874 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:26.484632 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/5a42c7a9-8577-4eea-81c1-18b2dd433150-podres\") pod \"perf-node-gather-daemonset-rgrkk\" (UID: \"5a42c7a9-8577-4eea-81c1-18b2dd433150\") " pod="openshift-must-gather-kw59r/perf-node-gather-daemonset-rgrkk"
Apr 16 18:14:26.484874 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:26.484735 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5a42c7a9-8577-4eea-81c1-18b2dd433150-lib-modules\") pod \"perf-node-gather-daemonset-rgrkk\" (UID: \"5a42c7a9-8577-4eea-81c1-18b2dd433150\") " pod="openshift-must-gather-kw59r/perf-node-gather-daemonset-rgrkk"
Apr 16 18:14:26.484874 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:26.484739 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/5a42c7a9-8577-4eea-81c1-18b2dd433150-podres\") pod \"perf-node-gather-daemonset-rgrkk\" (UID: \"5a42c7a9-8577-4eea-81c1-18b2dd433150\") " pod="openshift-must-gather-kw59r/perf-node-gather-daemonset-rgrkk"
Apr 16 18:14:26.495397 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:26.495338 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldwkv\" (UniqueName: \"kubernetes.io/projected/5a42c7a9-8577-4eea-81c1-18b2dd433150-kube-api-access-ldwkv\") pod \"perf-node-gather-daemonset-rgrkk\" (UID: \"5a42c7a9-8577-4eea-81c1-18b2dd433150\") " pod="openshift-must-gather-kw59r/perf-node-gather-daemonset-rgrkk"
Apr 16 18:14:26.559790 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:26.559762 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kw59r/perf-node-gather-daemonset-rgrkk"
Apr 16 18:14:26.696572 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:26.696542 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kw59r/perf-node-gather-daemonset-rgrkk"]
Apr 16 18:14:26.697416 ip-10-0-141-32 kubenswrapper[2576]: W0416 18:14:26.697389 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod5a42c7a9_8577_4eea_81c1_18b2dd433150.slice/crio-d7e79c941361aa7bfba7c6c564a103082203297738693208060c2afac8417748 WatchSource:0}: Error finding container d7e79c941361aa7bfba7c6c564a103082203297738693208060c2afac8417748: Status 404 returned error can't find the container with id d7e79c941361aa7bfba7c6c564a103082203297738693208060c2afac8417748
Apr 16 18:14:27.171513 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:27.171481 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6d867bf5b5-fp8rs_1d6887bc-23dd-449c-aecc-8adedad8c860/console/0.log"
Apr 16 18:14:27.207050 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:27.207019 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-586b57c7b4-r58vb_421e5aad-dcec-441f-b232-eff2d7c6af79/download-server/0.log"
Apr 16 18:14:27.360151 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:27.360114 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kw59r/perf-node-gather-daemonset-rgrkk" event={"ID":"5a42c7a9-8577-4eea-81c1-18b2dd433150","Type":"ContainerStarted","Data":"31b42843f593b02b53cc0d8693d9758df64c75677a24ce0291d7fff98141a033"}
Apr 16 18:14:27.360151 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:27.360153 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kw59r/perf-node-gather-daemonset-rgrkk" event={"ID":"5a42c7a9-8577-4eea-81c1-18b2dd433150","Type":"ContainerStarted","Data":"d7e79c941361aa7bfba7c6c564a103082203297738693208060c2afac8417748"}
Apr 16 18:14:27.360684 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:27.360232 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-kw59r/perf-node-gather-daemonset-rgrkk"
Apr 16 18:14:27.386214 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:27.386162 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-kw59r/perf-node-gather-daemonset-rgrkk" podStartSLOduration=1.386145064 podStartE2EDuration="1.386145064s" podCreationTimestamp="2026-04-16 18:14:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:14:27.38319754 +0000 UTC m=+2065.399665071" watchObservedRunningTime="2026-04-16 18:14:27.386145064 +0000 UTC m=+2065.402612585"
Apr 16 18:14:28.737602 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:28.737578 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-vszkl_643af40e-aabc-4fb6-8e21-7926a029b0a0/dns/0.log"
Apr 16 18:14:28.763861 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:28.763836 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-vszkl_643af40e-aabc-4fb6-8e21-7926a029b0a0/kube-rbac-proxy/0.log"
Apr 16 18:14:28.817892 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:28.817863 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-p4dbv_559281b5-6292-4642-95ef-022daeacb46b/dns-node-resolver/0.log"
Apr 16 18:14:29.386253 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:29.386223 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-96578b4d-66dzw_e9d84c5e-5875-4d83-9674-ab0c1accad47/registry/0.log"
Apr 16 18:14:29.410054 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:29.410026 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-cbg2b_77172d03-834d-4c9b-8b9c-2d2f57a663cd/node-ca/0.log"
Apr 16 18:14:30.412520 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:30.412492 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-4xmn4_e37d1415-b492-405e-9870-3b7c399f5c0d/discovery/0.log"
Apr 16 18:14:30.437826 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:30.437794 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_openshift-ai-inference-openshift-default-7c5447bb76-4ltxc_bebca98f-660f-43f3-90d4-f1020f766e4f/istio-proxy/0.log"
Apr 16 18:14:30.467856 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:30.467829 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-587cd5bb74-wz624_2c0dcd7e-0c96-40a1-a6fd-2c9e2109db1a/router/0.log"
Apr 16 18:14:31.015056 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:31.015028 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-pc75n_3bf148cf-abbf-4345-8487-fd40ceff855c/serve-healthcheck-canary/0.log"
Apr 16 18:14:31.549492 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:31.549466 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-5785d4fcdd-5t6rd_d2bce270-4a99-4e0a-a841-56b07aa9d0cb/insights-operator/0.log"
Apr 16 18:14:31.797274 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:31.797244 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-x72lh_d05422b4-dafd-47da-b046-07325a713255/kube-rbac-proxy/0.log"
Apr 16 18:14:31.832410 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:31.832333 2576 log.go:25] "Finished parsing log file"
path="/var/log/pods/openshift-insights_insights-runtime-extractor-x72lh_d05422b4-dafd-47da-b046-07325a713255/exporter/0.log" Apr 16 18:14:31.866245 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:31.866217 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-x72lh_d05422b4-dafd-47da-b046-07325a713255/extractor/0.log" Apr 16 18:14:33.374513 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:33.374487 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-kw59r/perf-node-gather-daemonset-rgrkk" Apr 16 18:14:40.645755 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:40.645717 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-64d4d94569-5vrzw_d6d3f0ae-15c7-4a35-80c9-6fae8ac4f2af/migrator/0.log" Apr 16 18:14:40.671276 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:40.671246 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-64d4d94569-5vrzw_d6d3f0ae-15c7-4a35-80c9-6fae8ac4f2af/graceful-termination/0.log" Apr 16 18:14:42.160888 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:42.160841 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8nf86_d10122cd-f300-4191-95af-3535482c3187/kube-multus/0.log" Apr 16 18:14:42.570222 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:42.570143 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-t42f2_404aadc7-59c7-4274-841e-38902a95c670/kube-multus-additional-cni-plugins/0.log" Apr 16 18:14:42.600652 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:42.600627 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-t42f2_404aadc7-59c7-4274-841e-38902a95c670/egress-router-binary-copy/0.log" Apr 16 18:14:42.625878 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:42.625843 2576 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-t42f2_404aadc7-59c7-4274-841e-38902a95c670/cni-plugins/0.log" Apr 16 18:14:42.650785 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:42.650754 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-t42f2_404aadc7-59c7-4274-841e-38902a95c670/bond-cni-plugin/0.log" Apr 16 18:14:42.677479 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:42.677455 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-t42f2_404aadc7-59c7-4274-841e-38902a95c670/routeoverride-cni/0.log" Apr 16 18:14:42.705903 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:42.705881 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-t42f2_404aadc7-59c7-4274-841e-38902a95c670/whereabouts-cni-bincopy/0.log" Apr 16 18:14:42.731255 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:42.731235 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-t42f2_404aadc7-59c7-4274-841e-38902a95c670/whereabouts-cni/0.log" Apr 16 18:14:42.892744 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:42.892667 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-tw2xb_98ed775b-36f2-475e-9c1b-e1e3a5261ed5/network-metrics-daemon/0.log" Apr 16 18:14:42.914771 ip-10-0-141-32 kubenswrapper[2576]: I0416 18:14:42.914747 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-tw2xb_98ed775b-36f2-475e-9c1b-e1e3a5261ed5/kube-rbac-proxy/0.log"