Apr 17 14:31:28.035323 ip-10-0-143-171 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 17 14:31:28.035339 ip-10-0-143-171 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 17 14:31:28.035349 ip-10-0-143-171 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 17 14:31:28.035675 ip-10-0-143-171 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 17 14:31:38.252921 ip-10-0-143-171 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 17 14:31:38.252938 ip-10-0-143-171 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot bbaff3f850024fe9b24f81815e569eb1 --
Apr 17 14:34:07.011112 ip-10-0-143-171 systemd[1]: Starting Kubernetes Kubelet...
Apr 17 14:34:07.484133 ip-10-0-143-171 kubenswrapper[2582]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 14:34:07.484133 ip-10-0-143-171 kubenswrapper[2582]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 17 14:34:07.484133 ip-10-0-143-171 kubenswrapper[2582]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 14:34:07.484133 ip-10-0-143-171 kubenswrapper[2582]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 17 14:34:07.484133 ip-10-0-143-171 kubenswrapper[2582]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 14:34:07.486170 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.486079 2582 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 17 14:34:07.491397 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.491372 2582 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 14:34:07.491397 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.491391 2582 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 14:34:07.491397 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.491396 2582 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 14:34:07.491397 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.491401 2582 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 14:34:07.491397 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.491405 2582 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 14:34:07.491685 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.491409 2582 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 14:34:07.491685 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.491413 2582 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 14:34:07.491685 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.491417 2582 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 14:34:07.491685 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.491421 2582 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 14:34:07.491685 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.491425 2582 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 14:34:07.491685 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.491430 2582 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 14:34:07.491685 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.491435 2582 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 14:34:07.491685 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.491439 2582 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 14:34:07.491685 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.491443 2582 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 14:34:07.491685 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.491447 2582 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 14:34:07.491685 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.491451 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 14:34:07.491685 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.491455 2582 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 14:34:07.491685 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.491459 2582 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 14:34:07.491685 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.491463 2582 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 14:34:07.491685 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.491468 2582 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 14:34:07.491685 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.491472 2582 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 14:34:07.491685 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.491476 2582 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 14:34:07.491685 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.491480 2582 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 14:34:07.491685 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.491501 2582 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 14:34:07.492455 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.491505 2582 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 14:34:07.492455 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.491509 2582 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 14:34:07.492455 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.491513 2582 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 14:34:07.492455 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.491517 2582 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 14:34:07.492455 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.491521 2582 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 14:34:07.492455 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.491525 2582 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 14:34:07.492455 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.491529 2582 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 14:34:07.492455 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.491533 2582 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 14:34:07.492455 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.491537 2582 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 14:34:07.492455 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.491541 2582 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 14:34:07.492455 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.491545 2582 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 14:34:07.492455 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.491549 2582 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 14:34:07.492455 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.491554 2582 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 14:34:07.492455 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.491558 2582 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 14:34:07.492455 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.491563 2582 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 14:34:07.492455 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.491568 2582 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 14:34:07.492455 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.491572 2582 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 14:34:07.492455 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.491576 2582 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 14:34:07.492455 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.491580 2582 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 14:34:07.492455 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.491585 2582 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 14:34:07.493173 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.491589 2582 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 14:34:07.493173 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.491595 2582 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 14:34:07.493173 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.491599 2582 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 14:34:07.493173 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.491604 2582 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 14:34:07.493173 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.491608 2582 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 14:34:07.493173 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.491613 2582 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 14:34:07.493173 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.491617 2582 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 14:34:07.493173 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.491621 2582 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 14:34:07.493173 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.491625 2582 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 14:34:07.493173 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.491630 2582 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 14:34:07.493173 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.491634 2582 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 14:34:07.493173 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.491639 2582 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 14:34:07.493173 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.491643 2582 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 14:34:07.493173 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.491646 2582 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 14:34:07.493173 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.491651 2582 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 14:34:07.493173 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.491654 2582 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 14:34:07.493173 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.491658 2582 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 14:34:07.493173 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.491662 2582 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 14:34:07.493173 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.491667 2582 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 14:34:07.493173 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.491671 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 14:34:07.493692 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.491675 2582 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 14:34:07.493692 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.491680 2582 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 14:34:07.493692 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.491684 2582 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 14:34:07.493692 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.491689 2582 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 14:34:07.493692 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.491693 2582 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 14:34:07.493692 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.491700 2582 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 14:34:07.493692 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.491704 2582 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 14:34:07.493692 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.491708 2582 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 14:34:07.493692 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.491712 2582 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 14:34:07.493692 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.491719 2582 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 14:34:07.493692 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.491731 2582 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 14:34:07.493692 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.491735 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 14:34:07.493692 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.491739 2582 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 14:34:07.493692 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.491743 2582 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 14:34:07.493692 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.491747 2582 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 14:34:07.493692 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.491751 2582 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 14:34:07.493692 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.491755 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 14:34:07.493692 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.491759 2582 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 14:34:07.493692 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.491763 2582 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 14:34:07.493692 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.491767 2582 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 14:34:07.494559 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.491771 2582 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 14:34:07.494559 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.491775 2582 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 14:34:07.494559 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.492417 2582 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 14:34:07.494559 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.492425 2582 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 14:34:07.494559 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.492431 2582 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 14:34:07.494559 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.492436 2582 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 14:34:07.494559 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.492440 2582 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 14:34:07.494559 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.492444 2582 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 14:34:07.494559 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.492448 2582 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 14:34:07.494559 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.492452 2582 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 14:34:07.494559 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.492456 2582 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 14:34:07.494559 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.492460 2582 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 14:34:07.494559 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.492463 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 14:34:07.494559 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.492467 2582 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 14:34:07.494559 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.492472 2582 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 14:34:07.494559 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.492476 2582 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 14:34:07.494559 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.492480 2582 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 14:34:07.494559 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.492484 2582 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 14:34:07.494559 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.492488 2582 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 14:34:07.494559 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.492492 2582 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 14:34:07.495351 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.492497 2582 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 14:34:07.495351 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.492501 2582 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 14:34:07.495351 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.492506 2582 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 14:34:07.495351 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.492510 2582 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 14:34:07.495351 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.492514 2582 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 14:34:07.495351 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.492518 2582 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 14:34:07.495351 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.492522 2582 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 14:34:07.495351 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.492526 2582 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 14:34:07.495351 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.492530 2582 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 14:34:07.495351 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.492534 2582 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 14:34:07.495351 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.492538 2582 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 14:34:07.495351 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.492543 2582 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 14:34:07.495351 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.492547 2582 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 14:34:07.495351 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.492551 2582 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 14:34:07.495351 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.492555 2582 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 14:34:07.495351 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.492559 2582 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 14:34:07.495351 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.492563 2582 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 14:34:07.495351 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.492566 2582 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 14:34:07.495351 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.492571 2582 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 14:34:07.495846 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.492575 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 14:34:07.495846 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.492579 2582 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 14:34:07.495846 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.492584 2582 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 14:34:07.495846 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.492588 2582 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 14:34:07.495846 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.492592 2582 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 14:34:07.495846 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.492596 2582 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 14:34:07.495846 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.492602 2582 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 14:34:07.495846 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.492606 2582 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 14:34:07.495846 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.492611 2582 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 14:34:07.495846 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.492616 2582 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 14:34:07.495846 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.492620 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 14:34:07.495846 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.492624 2582 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 14:34:07.495846 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.492628 2582 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 14:34:07.495846 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.492632 2582 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 14:34:07.495846 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.492636 2582 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 14:34:07.495846 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.492642 2582 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 14:34:07.495846 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.492649 2582 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 14:34:07.495846 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.492654 2582 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 14:34:07.495846 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.492658 2582 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 14:34:07.495846 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.492662 2582 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 14:34:07.496440 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.492666 2582 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 14:34:07.496440 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.492670 2582 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 14:34:07.496440 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.492674 2582 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 14:34:07.496440 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.492681 2582 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 14:34:07.496440 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.492687 2582 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 14:34:07.496440 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.492692 2582 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 14:34:07.496440 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.492696 2582 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 14:34:07.496440 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.492701 2582 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 14:34:07.496440 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.492705 2582 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 14:34:07.496440 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.492709 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 14:34:07.496440 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.492713 2582 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 14:34:07.496440 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.492718 2582 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 14:34:07.496440 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.492722 2582 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 14:34:07.496440 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.492726 2582 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 14:34:07.496440 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.492731 2582 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 14:34:07.496440 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.492735 2582 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 14:34:07.496440 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.492739 2582 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 14:34:07.496440 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.492745 2582 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 14:34:07.496440 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.492751 2582 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 14:34:07.497098 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.492755 2582 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 14:34:07.497098 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.492760 2582 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 14:34:07.497098 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.492764 2582 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 14:34:07.497098 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.492768 2582 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 14:34:07.497098 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.492772 2582 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 14:34:07.497098 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.492777 2582 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 14:34:07.497098 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.492781 2582 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 14:34:07.497098 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.492785 2582 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 14:34:07.497098 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.492789 2582 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 14:34:07.497098 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.492793 2582 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 14:34:07.497098 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.493753 2582 flags.go:64] FLAG: --address="0.0.0.0"
Apr 17 14:34:07.497098 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.493776 2582 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 17 14:34:07.497098 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.493786 2582 flags.go:64] FLAG: --anonymous-auth="true"
Apr 17 14:34:07.497098 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.493793 2582 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 17 14:34:07.497098 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.493822 2582 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 17 14:34:07.497098 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.493827 2582 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 17 14:34:07.497098 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.493834 2582 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 17 14:34:07.497098 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.493841 2582 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 17 14:34:07.497098 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.493846 2582 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 17 14:34:07.497098 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.493851 2582 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 17 14:34:07.497098 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.493856 2582 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 17 14:34:07.497098 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.493862 2582 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 17 14:34:07.497706 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.493867 2582 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 17 14:34:07.497706 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.493872 2582 flags.go:64] FLAG: --cgroup-root=""
Apr 17 14:34:07.497706 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.493877 2582 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 17 14:34:07.497706 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.493881 2582 flags.go:64] FLAG: --client-ca-file=""
Apr 17 14:34:07.497706 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.493886 2582 flags.go:64] FLAG: --cloud-config=""
Apr 17 14:34:07.497706 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.493891 2582 flags.go:64] FLAG: --cloud-provider="external"
Apr 17 14:34:07.497706 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.493895 2582 flags.go:64] FLAG: --cluster-dns="[]"
Apr 17 14:34:07.497706 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.493903 2582 flags.go:64] FLAG: --cluster-domain=""
Apr 17 14:34:07.497706 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.493908 2582 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 17 14:34:07.497706 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.493914 2582 flags.go:64] FLAG: --config-dir=""
Apr 17 14:34:07.497706 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.493919 2582 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 17 14:34:07.497706 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.493929 2582 flags.go:64] FLAG: --container-log-max-files="5"
Apr 17 14:34:07.497706 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.493936 2582 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 17 14:34:07.497706 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.493940 2582 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 17 14:34:07.497706 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.493946 2582 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 17 14:34:07.497706 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.493951 2582 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 17 14:34:07.497706 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.493956 2582 flags.go:64] FLAG: --contention-profiling="false"
Apr 17 14:34:07.497706 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.493961 2582 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 17 14:34:07.497706 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.493965 2582 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 17 14:34:07.497706 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.493971 2582 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 17 14:34:07.497706 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.493976 2582 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 17 14:34:07.497706 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.493984 2582 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 17 14:34:07.497706 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.493989 2582 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 17 14:34:07.497706 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.493994 2582 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 17 14:34:07.497706 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.493999 2582 flags.go:64] FLAG: --enable-load-reader="false"
Apr 17 14:34:07.498456 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494003 2582 flags.go:64] FLAG: --enable-server="true"
Apr 17 14:34:07.498456 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494008 2582 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 17 14:34:07.498456 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494015 2582 flags.go:64] FLAG: --event-burst="100"
Apr 17 14:34:07.498456 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494020 2582 flags.go:64] FLAG: --event-qps="50"
Apr 17 14:34:07.498456 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494024 2582 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 17 14:34:07.498456 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494030 2582 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 17 14:34:07.498456 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494034 2582 flags.go:64] FLAG: --eviction-hard=""
Apr 17 14:34:07.498456 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494048 2582 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 17 14:34:07.498456 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494053 2582 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 17 14:34:07.498456 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494058 2582 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 17 14:34:07.498456 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494062 2582 flags.go:64] FLAG: --eviction-soft=""
Apr 17 14:34:07.498456 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494067 2582 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 17 14:34:07.498456 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494072 2582 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 17 14:34:07.498456 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494076 2582 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 17 14:34:07.498456 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494081 2582 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 17 14:34:07.498456 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494086 2582 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 17 14:34:07.498456 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494091 2582 flags.go:64] FLAG: --fail-swap-on="true"
Apr 17 14:34:07.498456 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494096 2582 flags.go:64] FLAG: --feature-gates=""
Apr 17 14:34:07.498456 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494108 2582 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 17 14:34:07.498456 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494113 2582 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 17 14:34:07.498456 ip-10-0-143-171
kubenswrapper[2582]: I0417 14:34:07.494119 2582 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 17 14:34:07.498456 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494124 2582 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 17 14:34:07.498456 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494129 2582 flags.go:64] FLAG: --healthz-port="10248" Apr 17 14:34:07.498456 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494134 2582 flags.go:64] FLAG: --help="false" Apr 17 14:34:07.498456 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494139 2582 flags.go:64] FLAG: --hostname-override="ip-10-0-143-171.ec2.internal" Apr 17 14:34:07.499097 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494144 2582 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 17 14:34:07.499097 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494148 2582 flags.go:64] FLAG: --http-check-frequency="20s" Apr 17 14:34:07.499097 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494153 2582 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 17 14:34:07.499097 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494159 2582 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 17 14:34:07.499097 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494165 2582 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 17 14:34:07.499097 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494170 2582 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 17 14:34:07.499097 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494175 2582 flags.go:64] FLAG: --image-service-endpoint="" Apr 17 14:34:07.499097 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494179 2582 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 17 14:34:07.499097 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494184 2582 flags.go:64] FLAG: 
--kube-api-burst="100" Apr 17 14:34:07.499097 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494189 2582 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 17 14:34:07.499097 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494194 2582 flags.go:64] FLAG: --kube-api-qps="50" Apr 17 14:34:07.499097 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494199 2582 flags.go:64] FLAG: --kube-reserved="" Apr 17 14:34:07.499097 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494203 2582 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 17 14:34:07.499097 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494208 2582 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 17 14:34:07.499097 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494213 2582 flags.go:64] FLAG: --kubelet-cgroups="" Apr 17 14:34:07.499097 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494217 2582 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 17 14:34:07.499097 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494222 2582 flags.go:64] FLAG: --lock-file="" Apr 17 14:34:07.499097 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494226 2582 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 17 14:34:07.499097 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494231 2582 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 17 14:34:07.499097 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494237 2582 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 17 14:34:07.499097 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494246 2582 flags.go:64] FLAG: --log-json-split-stream="false" Apr 17 14:34:07.499097 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494251 2582 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 17 14:34:07.499097 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494256 2582 flags.go:64] FLAG: --log-text-split-stream="false" Apr 17 14:34:07.499097 ip-10-0-143-171 kubenswrapper[2582]: 
I0417 14:34:07.494261 2582 flags.go:64] FLAG: --logging-format="text" Apr 17 14:34:07.499731 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494266 2582 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 17 14:34:07.499731 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494273 2582 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 17 14:34:07.499731 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494277 2582 flags.go:64] FLAG: --manifest-url="" Apr 17 14:34:07.499731 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494282 2582 flags.go:64] FLAG: --manifest-url-header="" Apr 17 14:34:07.499731 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494290 2582 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 17 14:34:07.499731 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494294 2582 flags.go:64] FLAG: --max-open-files="1000000" Apr 17 14:34:07.499731 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494301 2582 flags.go:64] FLAG: --max-pods="110" Apr 17 14:34:07.499731 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494306 2582 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 17 14:34:07.499731 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494311 2582 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 17 14:34:07.499731 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494315 2582 flags.go:64] FLAG: --memory-manager-policy="None" Apr 17 14:34:07.499731 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494320 2582 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 17 14:34:07.499731 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494325 2582 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 17 14:34:07.499731 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494330 2582 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 17 14:34:07.499731 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494336 2582 flags.go:64] FLAG: 
--node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 17 14:34:07.499731 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494349 2582 flags.go:64] FLAG: --node-status-max-images="50" Apr 17 14:34:07.499731 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494354 2582 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 17 14:34:07.499731 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494359 2582 flags.go:64] FLAG: --oom-score-adj="-999" Apr 17 14:34:07.499731 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494364 2582 flags.go:64] FLAG: --pod-cidr="" Apr 17 14:34:07.499731 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494368 2582 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 17 14:34:07.499731 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494378 2582 flags.go:64] FLAG: --pod-manifest-path="" Apr 17 14:34:07.499731 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494383 2582 flags.go:64] FLAG: --pod-max-pids="-1" Apr 17 14:34:07.499731 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494387 2582 flags.go:64] FLAG: --pods-per-core="0" Apr 17 14:34:07.499731 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494392 2582 flags.go:64] FLAG: --port="10250" Apr 17 14:34:07.499731 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494397 2582 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 17 14:34:07.500323 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494402 2582 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-02fb635044235ce6e" Apr 17 14:34:07.500323 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494407 2582 flags.go:64] FLAG: --qos-reserved="" Apr 17 14:34:07.500323 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494412 2582 flags.go:64] FLAG: --read-only-port="10255" Apr 17 14:34:07.500323 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494417 
2582 flags.go:64] FLAG: --register-node="true" Apr 17 14:34:07.500323 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494422 2582 flags.go:64] FLAG: --register-schedulable="true" Apr 17 14:34:07.500323 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494427 2582 flags.go:64] FLAG: --register-with-taints="" Apr 17 14:34:07.500323 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494432 2582 flags.go:64] FLAG: --registry-burst="10" Apr 17 14:34:07.500323 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494439 2582 flags.go:64] FLAG: --registry-qps="5" Apr 17 14:34:07.500323 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494443 2582 flags.go:64] FLAG: --reserved-cpus="" Apr 17 14:34:07.500323 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494448 2582 flags.go:64] FLAG: --reserved-memory="" Apr 17 14:34:07.500323 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494457 2582 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 17 14:34:07.500323 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494462 2582 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 17 14:34:07.500323 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494467 2582 flags.go:64] FLAG: --rotate-certificates="false" Apr 17 14:34:07.500323 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494472 2582 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 17 14:34:07.500323 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494476 2582 flags.go:64] FLAG: --runonce="false" Apr 17 14:34:07.500323 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494481 2582 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 17 14:34:07.500323 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494486 2582 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 17 14:34:07.500323 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494491 2582 flags.go:64] FLAG: --seccomp-default="false" Apr 17 14:34:07.500323 ip-10-0-143-171 kubenswrapper[2582]: I0417 
14:34:07.494496 2582 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 17 14:34:07.500323 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494500 2582 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 17 14:34:07.500323 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494505 2582 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 17 14:34:07.500323 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494510 2582 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 17 14:34:07.500323 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494516 2582 flags.go:64] FLAG: --storage-driver-password="root" Apr 17 14:34:07.500323 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494520 2582 flags.go:64] FLAG: --storage-driver-secure="false" Apr 17 14:34:07.500323 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494525 2582 flags.go:64] FLAG: --storage-driver-table="stats" Apr 17 14:34:07.500323 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494530 2582 flags.go:64] FLAG: --storage-driver-user="root" Apr 17 14:34:07.500986 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494535 2582 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 17 14:34:07.500986 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494540 2582 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 17 14:34:07.500986 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494545 2582 flags.go:64] FLAG: --system-cgroups="" Apr 17 14:34:07.500986 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494549 2582 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 17 14:34:07.500986 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494558 2582 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 17 14:34:07.500986 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494562 2582 flags.go:64] FLAG: --tls-cert-file="" Apr 17 14:34:07.500986 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494566 2582 flags.go:64] FLAG: 
--tls-cipher-suites="[]" Apr 17 14:34:07.500986 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494573 2582 flags.go:64] FLAG: --tls-min-version="" Apr 17 14:34:07.500986 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494577 2582 flags.go:64] FLAG: --tls-private-key-file="" Apr 17 14:34:07.500986 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494581 2582 flags.go:64] FLAG: --topology-manager-policy="none" Apr 17 14:34:07.500986 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494586 2582 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 17 14:34:07.500986 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494591 2582 flags.go:64] FLAG: --topology-manager-scope="container" Apr 17 14:34:07.500986 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494596 2582 flags.go:64] FLAG: --v="2" Apr 17 14:34:07.500986 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494603 2582 flags.go:64] FLAG: --version="false" Apr 17 14:34:07.500986 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494609 2582 flags.go:64] FLAG: --vmodule="" Apr 17 14:34:07.500986 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494616 2582 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 17 14:34:07.500986 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.494621 2582 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 17 14:34:07.500986 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.494787 2582 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 14:34:07.500986 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.494795 2582 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 14:34:07.500986 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.494818 2582 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 14:34:07.500986 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.494823 2582 feature_gate.go:328] unrecognized feature gate: 
ManagedBootImagesvSphere Apr 17 14:34:07.500986 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.494827 2582 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 14:34:07.500986 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.494831 2582 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 14:34:07.500986 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.494836 2582 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 14:34:07.501620 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.494840 2582 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 14:34:07.501620 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.494844 2582 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 14:34:07.501620 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.494848 2582 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 14:34:07.501620 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.494852 2582 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 14:34:07.501620 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.494855 2582 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 14:34:07.501620 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.494860 2582 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 14:34:07.501620 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.494865 2582 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 14:34:07.501620 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.494869 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 14:34:07.501620 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.494873 2582 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 14:34:07.501620 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.494878 2582 feature_gate.go:328] unrecognized feature 
gate: PreconfiguredUDNAddresses Apr 17 14:34:07.501620 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.494882 2582 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 14:34:07.501620 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.494885 2582 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 14:34:07.501620 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.494889 2582 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 14:34:07.501620 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.494893 2582 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 14:34:07.501620 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.494897 2582 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 14:34:07.501620 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.494901 2582 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 14:34:07.501620 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.494905 2582 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 14:34:07.501620 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.494909 2582 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 14:34:07.501620 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.494914 2582 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 14:34:07.501620 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.494918 2582 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 14:34:07.502324 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.494922 2582 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 14:34:07.502324 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.494926 2582 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 14:34:07.502324 ip-10-0-143-171 
kubenswrapper[2582]: W0417 14:34:07.494931 2582 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 14:34:07.502324 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.494935 2582 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 14:34:07.502324 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.494939 2582 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 14:34:07.502324 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.494943 2582 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 14:34:07.502324 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.494948 2582 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 14:34:07.502324 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.494952 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 14:34:07.502324 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.494956 2582 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 14:34:07.502324 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.494960 2582 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 14:34:07.502324 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.494964 2582 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 14:34:07.502324 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.494968 2582 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 14:34:07.502324 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.494972 2582 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 14:34:07.502324 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.494976 2582 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 14:34:07.502324 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.494980 2582 feature_gate.go:328] unrecognized feature gate: 
NutanixMultiSubnets Apr 17 14:34:07.502324 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.494984 2582 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 14:34:07.502324 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.494989 2582 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 14:34:07.502324 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.494993 2582 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 14:34:07.502324 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.494998 2582 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 14:34:07.502324 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.495003 2582 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 14:34:07.502837 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.495007 2582 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 14:34:07.502837 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.495011 2582 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 14:34:07.502837 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.495016 2582 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 14:34:07.502837 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.495020 2582 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 14:34:07.502837 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.495024 2582 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 14:34:07.502837 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.495027 2582 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 14:34:07.502837 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.495032 2582 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 14:34:07.502837 ip-10-0-143-171 kubenswrapper[2582]: 
W0417 14:34:07.495035 2582 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 14:34:07.502837 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.495039 2582 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 14:34:07.502837 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.495043 2582 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 14:34:07.502837 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.495047 2582 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 14:34:07.502837 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.495052 2582 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 14:34:07.502837 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.495055 2582 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 14:34:07.502837 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.495060 2582 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 14:34:07.502837 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.495064 2582 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 14:34:07.502837 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.495070 2582 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 17 14:34:07.502837 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.495077 2582 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 14:34:07.502837 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.495081 2582 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 14:34:07.502837 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.495085 2582 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 14:34:07.503341 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.495090 2582 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 14:34:07.503341 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.495094 2582 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 14:34:07.503341 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.495098 2582 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 14:34:07.503341 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.495102 2582 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 14:34:07.503341 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.495106 2582 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 14:34:07.503341 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.495110 2582 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 14:34:07.503341 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.495114 2582 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 14:34:07.503341 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.495120 2582 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 17 14:34:07.503341 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.495126 2582 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 14:34:07.503341 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.495131 2582 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 14:34:07.503341 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.495136 2582 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 14:34:07.503341 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.495141 2582 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 14:34:07.503341 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.495146 2582 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 14:34:07.503341 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.495150 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 14:34:07.503341 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.495154 2582 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 14:34:07.503341 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.495159 2582 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 14:34:07.503341 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.495163 2582 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 14:34:07.503341 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.495167 2582 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 14:34:07.503341 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.495171 2582 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 14:34:07.503819 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.495175 2582 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 14:34:07.503819 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.496000 2582 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 14:34:07.503819 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.503534 2582 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 17 14:34:07.503819 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.503553 2582 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 17 14:34:07.503819 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.503606 2582 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 14:34:07.503819 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.503612 2582 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 14:34:07.503819 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.503615 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 14:34:07.503819 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.503619 2582 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 14:34:07.503819 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.503622 2582 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 14:34:07.503819 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.503624 2582 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 14:34:07.503819 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.503627 2582 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 14:34:07.503819 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.503630 2582 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 14:34:07.503819 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.503632 2582 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 14:34:07.503819 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.503635 2582 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 14:34:07.503819 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.503638 2582 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 14:34:07.504203 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.503641 2582 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 14:34:07.504203 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.503643 2582 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 14:34:07.504203 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.503646 2582 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 14:34:07.504203 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.503648 2582 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 14:34:07.504203 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.503651 2582 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 14:34:07.504203 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.503653 2582 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 14:34:07.504203 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.503656 2582 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 14:34:07.504203 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.503658 2582 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 14:34:07.504203 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.503661 2582 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 14:34:07.504203 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.503663 2582 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 14:34:07.504203 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.503666 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 14:34:07.504203 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.503669 2582 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 14:34:07.504203 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.503671 2582 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 14:34:07.504203 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.503674 2582 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 14:34:07.504203 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.503676 2582 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 14:34:07.504203 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.503679 2582 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 14:34:07.504203 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.503682 2582 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 14:34:07.504203 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.503685 2582 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 14:34:07.504203 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.503689 2582 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 14:34:07.504203 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.503693 2582 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 14:34:07.504681 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.503697 2582 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 14:34:07.504681 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.503700 2582 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 14:34:07.504681 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.503703 2582 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 14:34:07.504681 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.503706 2582 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 14:34:07.504681 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.503709 2582 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 14:34:07.504681 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.503711 2582 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 14:34:07.504681 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.503714 2582 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 14:34:07.504681 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.503717 2582 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 14:34:07.504681 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.503720 2582 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 14:34:07.504681 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.503722 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 14:34:07.504681 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.503725 2582 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 14:34:07.504681 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.503729 2582 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 14:34:07.504681 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.503733 2582 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 14:34:07.504681 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.503736 2582 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 14:34:07.504681 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.503738 2582 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 14:34:07.504681 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.503740 2582 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 14:34:07.504681 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.503743 2582 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 14:34:07.504681 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.503745 2582 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 14:34:07.504681 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.503748 2582 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 14:34:07.505180 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.503750 2582 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 14:34:07.505180 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.503753 2582 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 14:34:07.505180 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.503755 2582 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 14:34:07.505180 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.503757 2582 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 14:34:07.505180 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.503761 2582 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 14:34:07.505180 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.503764 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 14:34:07.505180 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.503766 2582 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 14:34:07.505180 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.503780 2582 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 14:34:07.505180 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.503783 2582 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 14:34:07.505180 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.503785 2582 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 14:34:07.505180 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.503788 2582 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 14:34:07.505180 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.503790 2582 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 14:34:07.505180 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.503793 2582 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 14:34:07.505180 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.503796 2582 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 14:34:07.505180 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.503813 2582 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 14:34:07.505180 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.503816 2582 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 14:34:07.505180 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.503818 2582 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 14:34:07.505180 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.503821 2582 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 14:34:07.505180 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.503824 2582 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 14:34:07.505180 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.503827 2582 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 14:34:07.505672 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.503830 2582 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 14:34:07.505672 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.503832 2582 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 14:34:07.505672 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.503835 2582 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 14:34:07.505672 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.503838 2582 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 14:34:07.505672 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.503840 2582 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 14:34:07.505672 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.503843 2582 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 14:34:07.505672 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.503845 2582 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 14:34:07.505672 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.503848 2582 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 14:34:07.505672 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.503851 2582 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 14:34:07.505672 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.503853 2582 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 14:34:07.505672 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.503856 2582 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 14:34:07.505672 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.503859 2582 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 14:34:07.505672 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.503861 2582 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 14:34:07.505672 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.503864 2582 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 14:34:07.505672 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.503866 2582 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 14:34:07.505672 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.503870 2582 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 14:34:07.506144 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.503875 2582 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 14:34:07.506144 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.503975 2582 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 14:34:07.506144 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.503979 2582 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 14:34:07.506144 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.503983 2582 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 14:34:07.506144 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.503985 2582 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 14:34:07.506144 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.503988 2582 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 14:34:07.506144 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.503991 2582 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 14:34:07.506144 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.503993 2582 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 14:34:07.506144 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.503997 2582 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 14:34:07.506144 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.504002 2582 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 14:34:07.506144 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.504005 2582 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 14:34:07.506144 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.504008 2582 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 14:34:07.506144 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.504011 2582 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 14:34:07.506144 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.504014 2582 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 14:34:07.506144 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.504017 2582 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 14:34:07.506517 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.504019 2582 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 14:34:07.506517 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.504022 2582 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 14:34:07.506517 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.504024 2582 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 14:34:07.506517 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.504027 2582 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 14:34:07.506517 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.504029 2582 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 14:34:07.506517 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.504032 2582 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 14:34:07.506517 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.504035 2582 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 14:34:07.506517 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.504037 2582 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 14:34:07.506517 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.504040 2582 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 14:34:07.506517 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.504042 2582 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 14:34:07.506517 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.504045 2582 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 14:34:07.506517 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.504047 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 14:34:07.506517 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.504049 2582 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 14:34:07.506517 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.504052 2582 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 14:34:07.506517 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.504054 2582 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 14:34:07.506517 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.504057 2582 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 14:34:07.506517 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.504059 2582 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 14:34:07.506517 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.504062 2582 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 14:34:07.506517 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.504065 2582 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 14:34:07.506517 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.504068 2582 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 14:34:07.507062 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.504070 2582 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 14:34:07.507062 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.504073 2582 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 14:34:07.507062 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.504076 2582 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 14:34:07.507062 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.504078 2582 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 14:34:07.507062 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.504081 2582 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 14:34:07.507062 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.504084 2582 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 14:34:07.507062 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.504088 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 14:34:07.507062 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.504091 2582 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 14:34:07.507062 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.504094 2582 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 14:34:07.507062 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.504096 2582 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 14:34:07.507062 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.504099 2582 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 14:34:07.507062 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.504101 2582 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 14:34:07.507062 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.504103 2582 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 14:34:07.507062 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.504106 2582 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 14:34:07.507062 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.504108 2582 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 14:34:07.507062 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.504111 2582 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 14:34:07.507062 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.504113 2582 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 14:34:07.507062 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.504116 2582 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 14:34:07.507062 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.504118 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 14:34:07.507522 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.504120 2582 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 14:34:07.507522 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.504123 2582 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 14:34:07.507522 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.504125 2582 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 14:34:07.507522 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.504128 2582 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 14:34:07.507522 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.504130 2582 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 14:34:07.507522 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.504132 2582 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 14:34:07.507522 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.504135 2582 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 14:34:07.507522 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.504137 2582 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 14:34:07.507522 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.504140 2582 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 14:34:07.507522 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.504142 2582 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 14:34:07.507522 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.504145 2582 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 14:34:07.507522 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.504147 2582 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 14:34:07.507522 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.504150 2582 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 14:34:07.507522 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.504153 2582 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 14:34:07.507522 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.504155 2582 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 14:34:07.507522 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.504158 2582 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 14:34:07.507522 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.504160 2582 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 14:34:07.507522 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.504163 2582 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 14:34:07.507522 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.504165 2582 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 14:34:07.507522 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.504168 2582 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 14:34:07.508123 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.504171 2582 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 14:34:07.508123 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.504173 2582 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 14:34:07.508123 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.504176 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 14:34:07.508123 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.504178 2582 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 14:34:07.508123 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.504180 2582 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 14:34:07.508123 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.504183 2582 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 14:34:07.508123 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.504185 2582 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 14:34:07.508123 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.504188 2582 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 14:34:07.508123 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.504190 2582 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 14:34:07.508123 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.504193 2582 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 14:34:07.508123 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.504195 2582 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 14:34:07.508123 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.504198 2582 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 14:34:07.508123 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.504200 2582 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 14:34:07.508123 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.504205 2582 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 14:34:07.508123 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.505303 2582 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 17 14:34:07.508504 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.507492 2582 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 17 14:34:07.508741 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.508727 2582 server.go:1019] "Starting client certificate rotation"
Apr 17 14:34:07.508865 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.508846 2582 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 14:34:07.508908 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.508893 2582 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 14:34:07.531866 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.531837 2582 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 14:34:07.534917 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.534894 2582 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 14:34:07.554417 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.554391 2582 log.go:25] "Validated CRI v1 runtime API"
Apr 17 14:34:07.559981 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.559956 2582 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 17 14:34:07.560933 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.560917 2582 log.go:25] "Validated CRI v1 image API"
Apr 17 14:34:07.562834 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.562816 2582 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 17 14:34:07.566567 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.566536 2582 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 a959cddb-9dd2-4e98-956c-dbb4fe186826:/dev/nvme0n1p3 f707006a-ef45-4e0b-ad63-7fcc37e3d6e2:/dev/nvme0n1p4]
Apr 17 14:34:07.566629 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.566565 2582 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 17 14:34:07.572586 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.572462 2582 manager.go:217] Machine: {Timestamp:2026-04-17 14:34:07.570381335 +0000 UTC m=+0.437766977 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100516 MemoryCapacity:32812175360 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec294af1b4c12146f6f5fbf7b3bfe696 SystemUUID:ec294af1-b4c1-2146-f6f5-fbf7b3bfe696 BootID:bbaff3f8-5002-4fe9-b24f-81815e569eb1 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406089728 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:30:22:77:3e:5f Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:30:22:77:3e:5f Speed:0 Mtu:9001} {Name:ovs-system MacAddress:6a:8e:3c:2a:62:32 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812175360 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 17 14:34:07.572586 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.572580 2582 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 17 14:34:07.572702 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.572673 2582 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 17 14:34:07.574075 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.574048 2582 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 17 14:34:07.574219 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.574078 2582 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-143-171.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":
"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 17 14:34:07.574267 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.574230 2582 topology_manager.go:138] "Creating topology manager with none policy" Apr 17 14:34:07.574267 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.574238 2582 container_manager_linux.go:306] "Creating device plugin manager" Apr 17 14:34:07.574267 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.574251 2582 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 14:34:07.575043 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.575033 2582 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 14:34:07.575579 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.575562 2582 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-9ddfw" Apr 17 14:34:07.576334 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.576324 2582 state_mem.go:36] "Initialized new in-memory state store" Apr 17 14:34:07.576460 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.576450 2582 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 17 14:34:07.579057 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.579047 2582 kubelet.go:491] "Attempting to sync node with API server" Apr 17 14:34:07.579092 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.579061 2582 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 17 14:34:07.579092 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.579074 2582 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 17 14:34:07.579092 
ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.579086 2582 kubelet.go:397] "Adding apiserver pod source" Apr 17 14:34:07.579213 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.579096 2582 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 17 14:34:07.580248 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.580236 2582 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 14:34:07.580295 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.580255 2582 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 14:34:07.582717 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.582700 2582 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-9ddfw" Apr 17 14:34:07.584197 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.584169 2582 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 17 14:34:07.586179 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.586155 2582 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 17 14:34:07.588261 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.588240 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 17 14:34:07.588261 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.588260 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 17 14:34:07.588261 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.588267 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 17 14:34:07.588471 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.588273 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 17 14:34:07.588471 
ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.588279 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 17 14:34:07.588471 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.588285 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 17 14:34:07.588471 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.588313 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 17 14:34:07.588471 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.588327 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 17 14:34:07.588471 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.588335 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 17 14:34:07.588471 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.588342 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 17 14:34:07.588471 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.588351 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 17 14:34:07.588471 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.588362 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 17 14:34:07.590323 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.590294 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 17 14:34:07.590387 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.590332 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 17 14:34:07.593843 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.593827 2582 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 14:34:07.594740 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.594726 2582 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 17 14:34:07.594839 ip-10-0-143-171 kubenswrapper[2582]: I0417 
14:34:07.594770 2582 server.go:1295] "Started kubelet" Apr 17 14:34:07.594922 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.594872 2582 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 17 14:34:07.595048 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.594995 2582 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 17 14:34:07.595104 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.595057 2582 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 17 14:34:07.595605 ip-10-0-143-171 systemd[1]: Started Kubernetes Kubelet. Apr 17 14:34:07.596935 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.596916 2582 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 14:34:07.597022 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.596921 2582 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 17 14:34:07.597765 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.597752 2582 server.go:317] "Adding debug handlers to kubelet server" Apr 17 14:34:07.599149 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.599131 2582 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-143-171.ec2.internal" not found Apr 17 14:34:07.602649 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:34:07.602626 2582 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 17 14:34:07.602823 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.602782 2582 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 17 14:34:07.603172 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.603158 2582 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 17 14:34:07.603789 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.603772 2582 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 17 14:34:07.603789 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.603792 2582 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 17 14:34:07.603932 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.603771 2582 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 17 14:34:07.603932 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.603836 2582 factory.go:55] Registering systemd factory Apr 17 14:34:07.603932 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.603851 2582 factory.go:223] Registration of the systemd container factory successfully Apr 17 14:34:07.603932 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.603925 2582 reconstruct.go:97] "Volume reconstruction finished" Apr 17 14:34:07.603932 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.603933 2582 reconciler.go:26] "Reconciler: start to sync state" Apr 17 14:34:07.604090 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:34:07.603938 2582 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-171.ec2.internal\" not found" Apr 17 14:34:07.604281 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.604266 2582 factory.go:153] Registering CRI-O factory Apr 17 14:34:07.604318 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.604285 2582 factory.go:223] Registration of the crio container factory successfully Apr 17 
14:34:07.604345 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.604334 2582 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 17 14:34:07.604377 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.604361 2582 factory.go:103] Registering Raw factory Apr 17 14:34:07.604404 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.604376 2582 manager.go:1196] Started watching for new ooms in manager Apr 17 14:34:07.604845 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.604829 2582 manager.go:319] Starting recovery of all containers Apr 17 14:34:07.605786 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.605763 2582 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 14:34:07.608292 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:34:07.608147 2582 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-143-171.ec2.internal\" not found" node="ip-10-0-143-171.ec2.internal" Apr 17 14:34:07.614183 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.614031 2582 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-143-171.ec2.internal" not found Apr 17 14:34:07.615336 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:07.615282 2582 helpers.go:245] readString: Failed to read "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service/memory.max": open /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service/memory.max: no such device Apr 17 14:34:07.615966 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.615947 2582 manager.go:324] Recovery completed Apr 17 14:34:07.621029 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.621016 2582 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 
14:34:07.623794 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.623777 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-171.ec2.internal" event="NodeHasSufficientMemory" Apr 17 14:34:07.623893 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.623823 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-171.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 14:34:07.623893 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.623838 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-171.ec2.internal" event="NodeHasSufficientPID" Apr 17 14:34:07.624325 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.624311 2582 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 17 14:34:07.624363 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.624326 2582 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 17 14:34:07.624363 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.624344 2582 state_mem.go:36] "Initialized new in-memory state store" Apr 17 14:34:07.626502 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.626488 2582 policy_none.go:49] "None policy: Start" Apr 17 14:34:07.626502 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.626504 2582 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 17 14:34:07.626607 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.626513 2582 state_mem.go:35] "Initializing new in-memory state store" Apr 17 14:34:07.667496 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.667476 2582 manager.go:341] "Starting Device Plugin manager" Apr 17 14:34:07.682426 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:34:07.667511 2582 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 17 14:34:07.682426 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.667521 2582 server.go:85] "Starting device plugin registration server" Apr 17 
14:34:07.682426 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.667795 2582 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 17 14:34:07.682426 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.667826 2582 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 17 14:34:07.682426 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.667928 2582 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 17 14:34:07.682426 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.668003 2582 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 17 14:34:07.682426 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.668013 2582 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 17 14:34:07.682426 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:34:07.668624 2582 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 17 14:34:07.682426 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:34:07.668659 2582 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-143-171.ec2.internal\" not found" Apr 17 14:34:07.682426 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.672308 2582 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-143-171.ec2.internal" not found Apr 17 14:34:07.736391 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.736285 2582 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 17 14:34:07.737706 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.737685 2582 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv6" Apr 17 14:34:07.737770 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.737725 2582 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 17 14:34:07.737770 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.737751 2582 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 17 14:34:07.737770 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.737759 2582 kubelet.go:2451] "Starting kubelet main sync loop" Apr 17 14:34:07.738013 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:34:07.737991 2582 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 17 14:34:07.740061 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.740038 2582 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 14:34:07.768926 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.768868 2582 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 14:34:07.770010 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.769993 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-171.ec2.internal" event="NodeHasSufficientMemory" Apr 17 14:34:07.770104 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.770030 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-171.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 14:34:07.770104 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.770041 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-171.ec2.internal" event="NodeHasSufficientPID" Apr 17 14:34:07.770104 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.770083 2582 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-143-171.ec2.internal" Apr 17 14:34:07.778998 ip-10-0-143-171 kubenswrapper[2582]: I0417 
14:34:07.778975 2582 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-143-171.ec2.internal" Apr 17 14:34:07.778998 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:34:07.778998 2582 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-143-171.ec2.internal\": node \"ip-10-0-143-171.ec2.internal\" not found" Apr 17 14:34:07.838925 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.838879 2582 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-171.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-143-171.ec2.internal"] Apr 17 14:34:07.843399 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.843379 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-171.ec2.internal" Apr 17 14:34:07.843478 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.843392 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-171.ec2.internal" Apr 17 14:34:07.872606 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.872574 2582 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-171.ec2.internal" Apr 17 14:34:07.877157 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.877140 2582 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-171.ec2.internal" Apr 17 14:34:07.890387 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.890368 2582 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 17 14:34:07.890487 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:07.890372 2582 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 17 14:34:08.005909 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.005829 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ff484d21f6659085ac06da34cbc27dec-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-143-171.ec2.internal\" (UID: \"ff484d21f6659085ac06da34cbc27dec\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-171.ec2.internal" Apr 17 14:34:08.005909 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.005858 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/342031fe6556563fbef6e8d55c3a781f-config\") pod \"kube-apiserver-proxy-ip-10-0-143-171.ec2.internal\" (UID: \"342031fe6556563fbef6e8d55c3a781f\") " 
pod="kube-system/kube-apiserver-proxy-ip-10-0-143-171.ec2.internal" Apr 17 14:34:08.005909 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.005878 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/ff484d21f6659085ac06da34cbc27dec-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-143-171.ec2.internal\" (UID: \"ff484d21f6659085ac06da34cbc27dec\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-171.ec2.internal" Apr 17 14:34:08.106545 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.106513 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ff484d21f6659085ac06da34cbc27dec-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-143-171.ec2.internal\" (UID: \"ff484d21f6659085ac06da34cbc27dec\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-171.ec2.internal" Apr 17 14:34:08.106545 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.106548 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/342031fe6556563fbef6e8d55c3a781f-config\") pod \"kube-apiserver-proxy-ip-10-0-143-171.ec2.internal\" (UID: \"342031fe6556563fbef6e8d55c3a781f\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-143-171.ec2.internal" Apr 17 14:34:08.106752 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.106575 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/ff484d21f6659085ac06da34cbc27dec-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-143-171.ec2.internal\" (UID: \"ff484d21f6659085ac06da34cbc27dec\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-171.ec2.internal" Apr 17 14:34:08.106752 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.106607 2582 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ff484d21f6659085ac06da34cbc27dec-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-143-171.ec2.internal\" (UID: \"ff484d21f6659085ac06da34cbc27dec\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-171.ec2.internal" Apr 17 14:34:08.106752 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.106607 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/ff484d21f6659085ac06da34cbc27dec-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-143-171.ec2.internal\" (UID: \"ff484d21f6659085ac06da34cbc27dec\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-171.ec2.internal" Apr 17 14:34:08.106752 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.106610 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/342031fe6556563fbef6e8d55c3a781f-config\") pod \"kube-apiserver-proxy-ip-10-0-143-171.ec2.internal\" (UID: \"342031fe6556563fbef6e8d55c3a781f\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-143-171.ec2.internal" Apr 17 14:34:08.193764 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.193723 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-171.ec2.internal" Apr 17 14:34:08.193888 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.193711 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-171.ec2.internal" Apr 17 14:34:08.508115 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.508095 2582 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 17 14:34:08.508693 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.508224 2582 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 17 14:34:08.508693 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.508228 2582 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 17 14:34:08.508693 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.508249 2582 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 17 14:34:08.579707 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.579677 2582 apiserver.go:52] "Watching apiserver" Apr 17 14:34:08.584047 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.584020 2582 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-16 14:29:07 +0000 UTC" deadline="2028-01-28 14:25:03.653848722 +0000 UTC" Apr 17 14:34:08.584113 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.584045 2582 certificate_manager.go:431] "Waiting for next certificate rotation" 
logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15623h50m55.069805648s" Apr 17 14:34:08.586401 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.586385 2582 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 17 14:34:08.586856 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.586831 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-m5qlh","openshift-image-registry/node-ca-rhww7","openshift-multus/multus-4zcj9","openshift-multus/multus-additional-cni-plugins-5kzp2","openshift-network-operator/iptables-alerter-t74vl","kube-system/konnectivity-agent-lvtf4","openshift-cluster-node-tuning-operator/tuned-tcns5","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-171.ec2.internal","openshift-multus/network-metrics-daemon-4kdjq","openshift-network-diagnostics/network-check-target-jfnzx","openshift-ovn-kubernetes/ovnkube-node-f5brp","kube-system/kube-apiserver-proxy-ip-10-0-143-171.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nxxzr"] Apr 17 14:34:08.589512 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.589499 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-m5qlh" Apr 17 14:34:08.591607 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.591586 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 17 14:34:08.591716 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.591589 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 17 14:34:08.591716 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.591700 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-2cbqb\"" Apr 17 14:34:08.591820 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.591700 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-rhww7" Apr 17 14:34:08.594076 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.594043 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 17 14:34:08.594502 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.594485 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 17 14:34:08.595767 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.594776 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-njt8v\"" Apr 17 14:34:08.595767 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.595037 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 17 14:34:08.596614 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.596591 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-4zcj9" Apr 17 14:34:08.598705 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.598687 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 17 14:34:08.598899 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.598882 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-t74vl" Apr 17 14:34:08.598971 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.598932 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 17 14:34:08.598971 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.598957 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 17 14:34:08.599102 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.598975 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-8f7nj\"" Apr 17 14:34:08.599102 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.598882 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-5kzp2" Apr 17 14:34:08.599360 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.599344 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 17 14:34:08.601059 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.601040 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 17 14:34:08.601115 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.601058 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-jxqxc\"" Apr 17 14:34:08.601302 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.601290 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 17 14:34:08.601340 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.601316 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 17 14:34:08.601906 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.601892 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 17 14:34:08.601980 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.601893 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 17 14:34:08.601980 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.601893 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-jrs7q\"" Apr 17 14:34:08.602920 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.602904 2582 certificate_manager.go:566] "Rotating certificates" 
logger="kubernetes.io/kubelet-serving" Apr 17 14:34:08.603287 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.603272 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-lvtf4" Apr 17 14:34:08.603384 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.603368 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-tcns5" Apr 17 14:34:08.605310 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.605292 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 17 14:34:08.605566 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.605546 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 17 14:34:08.605660 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.605649 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-hthmd\"" Apr 17 14:34:08.605725 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.605663 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 17 14:34:08.605725 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.605666 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4kdjq" Apr 17 14:34:08.605849 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:34:08.605824 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4kdjq" podUID="fbcf40f6-2ec0-4fb3-85d8-30ecb284384d" Apr 17 14:34:08.605948 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.605933 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-tgpzr\"" Apr 17 14:34:08.606007 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.605955 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 17 14:34:08.608039 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.608018 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jfnzx" Apr 17 14:34:08.608114 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:34:08.608082 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-jfnzx" podUID="efde8bcb-629a-4cd7-9fe4-cea71e67b06e" Apr 17 14:34:08.608210 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.608193 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7f84353d-e913-4a0e-94b9-1138b03b1814-system-cni-dir\") pod \"multus-4zcj9\" (UID: \"7f84353d-e913-4a0e-94b9-1138b03b1814\") " pod="openshift-multus/multus-4zcj9" Apr 17 14:34:08.608248 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.608231 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7f84353d-e913-4a0e-94b9-1138b03b1814-host-run-netns\") pod \"multus-4zcj9\" (UID: \"7f84353d-e913-4a0e-94b9-1138b03b1814\") " pod="openshift-multus/multus-4zcj9" Apr 17 14:34:08.608291 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.608264 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7f84353d-e913-4a0e-94b9-1138b03b1814-host-var-lib-cni-multus\") pod \"multus-4zcj9\" (UID: \"7f84353d-e913-4a0e-94b9-1138b03b1814\") " pod="openshift-multus/multus-4zcj9" Apr 17 14:34:08.608334 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.608293 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/044e9f1a-a8ec-4b10-8647-92f9ec016842-lib-modules\") pod \"tuned-tcns5\" (UID: \"044e9f1a-a8ec-4b10-8647-92f9ec016842\") " pod="openshift-cluster-node-tuning-operator/tuned-tcns5" Apr 17 14:34:08.608334 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.608313 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/175f6a59-d17b-42f0-b454-ff9a315c3d7a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5kzp2\" (UID: \"175f6a59-d17b-42f0-b454-ff9a315c3d7a\") " pod="openshift-multus/multus-additional-cni-plugins-5kzp2" Apr 17 14:34:08.608334 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.608330 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/62d02888-cea1-4f15-b042-fb651835bf6a-agent-certs\") pod \"konnectivity-agent-lvtf4\" (UID: \"62d02888-cea1-4f15-b042-fb651835bf6a\") " pod="kube-system/konnectivity-agent-lvtf4" Apr 17 14:34:08.608459 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.608348 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7f84353d-e913-4a0e-94b9-1138b03b1814-host-run-k8s-cni-cncf-io\") pod \"multus-4zcj9\" (UID: \"7f84353d-e913-4a0e-94b9-1138b03b1814\") " pod="openshift-multus/multus-4zcj9" Apr 17 14:34:08.608459 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.608374 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7f84353d-e913-4a0e-94b9-1138b03b1814-hostroot\") pod \"multus-4zcj9\" (UID: \"7f84353d-e913-4a0e-94b9-1138b03b1814\") " pod="openshift-multus/multus-4zcj9" Apr 17 14:34:08.608459 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.608391 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7f84353d-e913-4a0e-94b9-1138b03b1814-multus-conf-dir\") pod \"multus-4zcj9\" (UID: \"7f84353d-e913-4a0e-94b9-1138b03b1814\") " pod="openshift-multus/multus-4zcj9" Apr 17 14:34:08.608459 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.608429 2582 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7f84353d-e913-4a0e-94b9-1138b03b1814-etc-kubernetes\") pod \"multus-4zcj9\" (UID: \"7f84353d-e913-4a0e-94b9-1138b03b1814\") " pod="openshift-multus/multus-4zcj9" Apr 17 14:34:08.608644 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.608466 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/044e9f1a-a8ec-4b10-8647-92f9ec016842-etc-modprobe-d\") pod \"tuned-tcns5\" (UID: \"044e9f1a-a8ec-4b10-8647-92f9ec016842\") " pod="openshift-cluster-node-tuning-operator/tuned-tcns5" Apr 17 14:34:08.608644 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.608491 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/044e9f1a-a8ec-4b10-8647-92f9ec016842-host\") pod \"tuned-tcns5\" (UID: \"044e9f1a-a8ec-4b10-8647-92f9ec016842\") " pod="openshift-cluster-node-tuning-operator/tuned-tcns5" Apr 17 14:34:08.608644 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.608517 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vql8w\" (UniqueName: \"kubernetes.io/projected/fbcf40f6-2ec0-4fb3-85d8-30ecb284384d-kube-api-access-vql8w\") pod \"network-metrics-daemon-4kdjq\" (UID: \"fbcf40f6-2ec0-4fb3-85d8-30ecb284384d\") " pod="openshift-multus/network-metrics-daemon-4kdjq" Apr 17 14:34:08.608644 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.608560 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/82ff5ce9-528b-4d19-9a09-ac7e64ef9d46-serviceca\") pod \"node-ca-rhww7\" (UID: \"82ff5ce9-528b-4d19-9a09-ac7e64ef9d46\") " pod="openshift-image-registry/node-ca-rhww7" Apr 17 14:34:08.608644 
ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.608585 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7f84353d-e913-4a0e-94b9-1138b03b1814-host-run-multus-certs\") pod \"multus-4zcj9\" (UID: \"7f84353d-e913-4a0e-94b9-1138b03b1814\") " pod="openshift-multus/multus-4zcj9" Apr 17 14:34:08.608644 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.608608 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/044e9f1a-a8ec-4b10-8647-92f9ec016842-sys\") pod \"tuned-tcns5\" (UID: \"044e9f1a-a8ec-4b10-8647-92f9ec016842\") " pod="openshift-cluster-node-tuning-operator/tuned-tcns5" Apr 17 14:34:08.608644 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.608631 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/044e9f1a-a8ec-4b10-8647-92f9ec016842-tmp\") pod \"tuned-tcns5\" (UID: \"044e9f1a-a8ec-4b10-8647-92f9ec016842\") " pod="openshift-cluster-node-tuning-operator/tuned-tcns5" Apr 17 14:34:08.609017 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.608656 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kt2m9\" (UniqueName: \"kubernetes.io/projected/82ff5ce9-528b-4d19-9a09-ac7e64ef9d46-kube-api-access-kt2m9\") pod \"node-ca-rhww7\" (UID: \"82ff5ce9-528b-4d19-9a09-ac7e64ef9d46\") " pod="openshift-image-registry/node-ca-rhww7" Apr 17 14:34:08.609017 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.608680 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/175f6a59-d17b-42f0-b454-ff9a315c3d7a-cnibin\") pod \"multus-additional-cni-plugins-5kzp2\" (UID: \"175f6a59-d17b-42f0-b454-ff9a315c3d7a\") " 
pod="openshift-multus/multus-additional-cni-plugins-5kzp2" Apr 17 14:34:08.609017 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.608703 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/62d02888-cea1-4f15-b042-fb651835bf6a-konnectivity-ca\") pod \"konnectivity-agent-lvtf4\" (UID: \"62d02888-cea1-4f15-b042-fb651835bf6a\") " pod="kube-system/konnectivity-agent-lvtf4" Apr 17 14:34:08.609017 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.608729 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/542f49ba-8bb4-4178-9c98-a94bc1f60de1-hosts-file\") pod \"node-resolver-m5qlh\" (UID: \"542f49ba-8bb4-4178-9c98-a94bc1f60de1\") " pod="openshift-dns/node-resolver-m5qlh" Apr 17 14:34:08.609017 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.608752 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7f84353d-e913-4a0e-94b9-1138b03b1814-multus-cni-dir\") pod \"multus-4zcj9\" (UID: \"7f84353d-e913-4a0e-94b9-1138b03b1814\") " pod="openshift-multus/multus-4zcj9" Apr 17 14:34:08.609017 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.608790 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7f84353d-e913-4a0e-94b9-1138b03b1814-cnibin\") pod \"multus-4zcj9\" (UID: \"7f84353d-e913-4a0e-94b9-1138b03b1814\") " pod="openshift-multus/multus-4zcj9" Apr 17 14:34:08.609017 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.608832 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7f84353d-e913-4a0e-94b9-1138b03b1814-os-release\") pod \"multus-4zcj9\" (UID: 
\"7f84353d-e913-4a0e-94b9-1138b03b1814\") " pod="openshift-multus/multus-4zcj9" Apr 17 14:34:08.609017 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.608854 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7f84353d-e913-4a0e-94b9-1138b03b1814-cni-binary-copy\") pod \"multus-4zcj9\" (UID: \"7f84353d-e913-4a0e-94b9-1138b03b1814\") " pod="openshift-multus/multus-4zcj9" Apr 17 14:34:08.609017 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.608875 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7f84353d-e913-4a0e-94b9-1138b03b1814-host-var-lib-kubelet\") pod \"multus-4zcj9\" (UID: \"7f84353d-e913-4a0e-94b9-1138b03b1814\") " pod="openshift-multus/multus-4zcj9" Apr 17 14:34:08.609017 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.608909 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/044e9f1a-a8ec-4b10-8647-92f9ec016842-etc-systemd\") pod \"tuned-tcns5\" (UID: \"044e9f1a-a8ec-4b10-8647-92f9ec016842\") " pod="openshift-cluster-node-tuning-operator/tuned-tcns5" Apr 17 14:34:08.609017 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.608941 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/542f49ba-8bb4-4178-9c98-a94bc1f60de1-tmp-dir\") pod \"node-resolver-m5qlh\" (UID: \"542f49ba-8bb4-4178-9c98-a94bc1f60de1\") " pod="openshift-dns/node-resolver-m5qlh" Apr 17 14:34:08.609017 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.608971 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e46a81b9-6866-4354-99c3-22badfdac979-host-slash\") 
pod \"iptables-alerter-t74vl\" (UID: \"e46a81b9-6866-4354-99c3-22badfdac979\") " pod="openshift-network-operator/iptables-alerter-t74vl" Apr 17 14:34:08.609017 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.608996 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/175f6a59-d17b-42f0-b454-ff9a315c3d7a-cni-binary-copy\") pod \"multus-additional-cni-plugins-5kzp2\" (UID: \"175f6a59-d17b-42f0-b454-ff9a315c3d7a\") " pod="openshift-multus/multus-additional-cni-plugins-5kzp2" Apr 17 14:34:08.609533 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.609027 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/175f6a59-d17b-42f0-b454-ff9a315c3d7a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5kzp2\" (UID: \"175f6a59-d17b-42f0-b454-ff9a315c3d7a\") " pod="openshift-multus/multus-additional-cni-plugins-5kzp2" Apr 17 14:34:08.609533 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.609051 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/175f6a59-d17b-42f0-b454-ff9a315c3d7a-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-5kzp2\" (UID: \"175f6a59-d17b-42f0-b454-ff9a315c3d7a\") " pod="openshift-multus/multus-additional-cni-plugins-5kzp2" Apr 17 14:34:08.609533 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.609068 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7f84353d-e913-4a0e-94b9-1138b03b1814-multus-daemon-config\") pod \"multus-4zcj9\" (UID: \"7f84353d-e913-4a0e-94b9-1138b03b1814\") " pod="openshift-multus/multus-4zcj9" Apr 17 14:34:08.609533 ip-10-0-143-171 kubenswrapper[2582]: I0417 
14:34:08.609083 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fjdf\" (UniqueName: \"kubernetes.io/projected/7f84353d-e913-4a0e-94b9-1138b03b1814-kube-api-access-8fjdf\") pod \"multus-4zcj9\" (UID: \"7f84353d-e913-4a0e-94b9-1138b03b1814\") " pod="openshift-multus/multus-4zcj9" Apr 17 14:34:08.609533 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.609100 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/044e9f1a-a8ec-4b10-8647-92f9ec016842-etc-kubernetes\") pod \"tuned-tcns5\" (UID: \"044e9f1a-a8ec-4b10-8647-92f9ec016842\") " pod="openshift-cluster-node-tuning-operator/tuned-tcns5" Apr 17 14:34:08.609533 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.609128 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/044e9f1a-a8ec-4b10-8647-92f9ec016842-etc-sysctl-conf\") pod \"tuned-tcns5\" (UID: \"044e9f1a-a8ec-4b10-8647-92f9ec016842\") " pod="openshift-cluster-node-tuning-operator/tuned-tcns5" Apr 17 14:34:08.609533 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.609170 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/175f6a59-d17b-42f0-b454-ff9a315c3d7a-os-release\") pod \"multus-additional-cni-plugins-5kzp2\" (UID: \"175f6a59-d17b-42f0-b454-ff9a315c3d7a\") " pod="openshift-multus/multus-additional-cni-plugins-5kzp2" Apr 17 14:34:08.609533 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.609198 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-727wf\" (UniqueName: \"kubernetes.io/projected/175f6a59-d17b-42f0-b454-ff9a315c3d7a-kube-api-access-727wf\") pod \"multus-additional-cni-plugins-5kzp2\" (UID: 
\"175f6a59-d17b-42f0-b454-ff9a315c3d7a\") " pod="openshift-multus/multus-additional-cni-plugins-5kzp2" Apr 17 14:34:08.609533 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.609227 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqmbs\" (UniqueName: \"kubernetes.io/projected/542f49ba-8bb4-4178-9c98-a94bc1f60de1-kube-api-access-pqmbs\") pod \"node-resolver-m5qlh\" (UID: \"542f49ba-8bb4-4178-9c98-a94bc1f60de1\") " pod="openshift-dns/node-resolver-m5qlh" Apr 17 14:34:08.609533 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.609250 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7f84353d-e913-4a0e-94b9-1138b03b1814-multus-socket-dir-parent\") pod \"multus-4zcj9\" (UID: \"7f84353d-e913-4a0e-94b9-1138b03b1814\") " pod="openshift-multus/multus-4zcj9" Apr 17 14:34:08.609533 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.609291 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7f84353d-e913-4a0e-94b9-1138b03b1814-host-var-lib-cni-bin\") pod \"multus-4zcj9\" (UID: \"7f84353d-e913-4a0e-94b9-1138b03b1814\") " pod="openshift-multus/multus-4zcj9" Apr 17 14:34:08.609533 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.609317 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/e46a81b9-6866-4354-99c3-22badfdac979-iptables-alerter-script\") pod \"iptables-alerter-t74vl\" (UID: \"e46a81b9-6866-4354-99c3-22badfdac979\") " pod="openshift-network-operator/iptables-alerter-t74vl" Apr 17 14:34:08.609533 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.609350 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-zbd96\" (UniqueName: \"kubernetes.io/projected/e46a81b9-6866-4354-99c3-22badfdac979-kube-api-access-zbd96\") pod \"iptables-alerter-t74vl\" (UID: \"e46a81b9-6866-4354-99c3-22badfdac979\") " pod="openshift-network-operator/iptables-alerter-t74vl" Apr 17 14:34:08.609533 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.609372 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vtvf\" (UniqueName: \"kubernetes.io/projected/044e9f1a-a8ec-4b10-8647-92f9ec016842-kube-api-access-9vtvf\") pod \"tuned-tcns5\" (UID: \"044e9f1a-a8ec-4b10-8647-92f9ec016842\") " pod="openshift-cluster-node-tuning-operator/tuned-tcns5" Apr 17 14:34:08.609533 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.609391 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/175f6a59-d17b-42f0-b454-ff9a315c3d7a-system-cni-dir\") pod \"multus-additional-cni-plugins-5kzp2\" (UID: \"175f6a59-d17b-42f0-b454-ff9a315c3d7a\") " pod="openshift-multus/multus-additional-cni-plugins-5kzp2" Apr 17 14:34:08.609533 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.609408 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/044e9f1a-a8ec-4b10-8647-92f9ec016842-etc-sysconfig\") pod \"tuned-tcns5\" (UID: \"044e9f1a-a8ec-4b10-8647-92f9ec016842\") " pod="openshift-cluster-node-tuning-operator/tuned-tcns5" Apr 17 14:34:08.610440 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.609448 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/044e9f1a-a8ec-4b10-8647-92f9ec016842-etc-sysctl-d\") pod \"tuned-tcns5\" (UID: \"044e9f1a-a8ec-4b10-8647-92f9ec016842\") " pod="openshift-cluster-node-tuning-operator/tuned-tcns5" Apr 17 
14:34:08.610440 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.609473 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/044e9f1a-a8ec-4b10-8647-92f9ec016842-run\") pod \"tuned-tcns5\" (UID: \"044e9f1a-a8ec-4b10-8647-92f9ec016842\") " pod="openshift-cluster-node-tuning-operator/tuned-tcns5" Apr 17 14:34:08.610440 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.609512 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/044e9f1a-a8ec-4b10-8647-92f9ec016842-var-lib-kubelet\") pod \"tuned-tcns5\" (UID: \"044e9f1a-a8ec-4b10-8647-92f9ec016842\") " pod="openshift-cluster-node-tuning-operator/tuned-tcns5" Apr 17 14:34:08.610440 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.609538 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/044e9f1a-a8ec-4b10-8647-92f9ec016842-etc-tuned\") pod \"tuned-tcns5\" (UID: \"044e9f1a-a8ec-4b10-8647-92f9ec016842\") " pod="openshift-cluster-node-tuning-operator/tuned-tcns5" Apr 17 14:34:08.610440 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.609561 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fbcf40f6-2ec0-4fb3-85d8-30ecb284384d-metrics-certs\") pod \"network-metrics-daemon-4kdjq\" (UID: \"fbcf40f6-2ec0-4fb3-85d8-30ecb284384d\") " pod="openshift-multus/network-metrics-daemon-4kdjq" Apr 17 14:34:08.610440 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.609584 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/82ff5ce9-528b-4d19-9a09-ac7e64ef9d46-host\") pod \"node-ca-rhww7\" (UID: 
\"82ff5ce9-528b-4d19-9a09-ac7e64ef9d46\") " pod="openshift-image-registry/node-ca-rhww7" Apr 17 14:34:08.610720 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.610569 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-f5brp" Apr 17 14:34:08.612520 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.612503 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 17 14:34:08.612678 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.612661 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nxxzr" Apr 17 14:34:08.612837 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.612822 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 17 14:34:08.613102 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.613085 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 17 14:34:08.613159 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.613105 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 17 14:34:08.613191 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.613110 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 17 14:34:08.613191 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.613091 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 17 14:34:08.613474 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.613462 2582 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-g6m69\"" Apr 17 14:34:08.614793 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.614777 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 17 14:34:08.615002 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.614986 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 17 14:34:08.615090 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.615015 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 17 14:34:08.615090 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.615053 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-vbldb\"" Apr 17 14:34:08.617596 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.617580 2582 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 17 14:34:08.632689 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.632666 2582 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-6pzgv" Apr 17 14:34:08.639913 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.639880 2582 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-6pzgv" Apr 17 14:34:08.704522 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.704501 2582 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 17 14:34:08.709748 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.709729 2582 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7f84353d-e913-4a0e-94b9-1138b03b1814-multus-socket-dir-parent\") pod \"multus-4zcj9\" (UID: \"7f84353d-e913-4a0e-94b9-1138b03b1814\") " pod="openshift-multus/multus-4zcj9" Apr 17 14:34:08.709857 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.709757 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/e46a81b9-6866-4354-99c3-22badfdac979-iptables-alerter-script\") pod \"iptables-alerter-t74vl\" (UID: \"e46a81b9-6866-4354-99c3-22badfdac979\") " pod="openshift-network-operator/iptables-alerter-t74vl" Apr 17 14:34:08.709857 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.709784 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f219cfa3-6451-4834-b849-5d264acf8bac-node-log\") pod \"ovnkube-node-f5brp\" (UID: \"f219cfa3-6451-4834-b849-5d264acf8bac\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5brp" Apr 17 14:34:08.709857 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.709845 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/044e9f1a-a8ec-4b10-8647-92f9ec016842-etc-sysconfig\") pod \"tuned-tcns5\" (UID: \"044e9f1a-a8ec-4b10-8647-92f9ec016842\") " pod="openshift-cluster-node-tuning-operator/tuned-tcns5" Apr 17 14:34:08.710035 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.709857 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7f84353d-e913-4a0e-94b9-1138b03b1814-multus-socket-dir-parent\") pod \"multus-4zcj9\" (UID: \"7f84353d-e913-4a0e-94b9-1138b03b1814\") " pod="openshift-multus/multus-4zcj9" Apr 17 14:34:08.710035 ip-10-0-143-171 kubenswrapper[2582]: I0417 
14:34:08.709871 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/044e9f1a-a8ec-4b10-8647-92f9ec016842-etc-sysctl-d\") pod \"tuned-tcns5\" (UID: \"044e9f1a-a8ec-4b10-8647-92f9ec016842\") " pod="openshift-cluster-node-tuning-operator/tuned-tcns5" Apr 17 14:34:08.710035 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.709893 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fbcf40f6-2ec0-4fb3-85d8-30ecb284384d-metrics-certs\") pod \"network-metrics-daemon-4kdjq\" (UID: \"fbcf40f6-2ec0-4fb3-85d8-30ecb284384d\") " pod="openshift-multus/network-metrics-daemon-4kdjq" Apr 17 14:34:08.710035 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.709925 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f219cfa3-6451-4834-b849-5d264acf8bac-run-systemd\") pod \"ovnkube-node-f5brp\" (UID: \"f219cfa3-6451-4834-b849-5d264acf8bac\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5brp" Apr 17 14:34:08.710035 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.709949 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f219cfa3-6451-4834-b849-5d264acf8bac-run-openvswitch\") pod \"ovnkube-node-f5brp\" (UID: \"f219cfa3-6451-4834-b849-5d264acf8bac\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5brp" Apr 17 14:34:08.710035 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.709970 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/044e9f1a-a8ec-4b10-8647-92f9ec016842-etc-sysconfig\") pod \"tuned-tcns5\" (UID: \"044e9f1a-a8ec-4b10-8647-92f9ec016842\") " pod="openshift-cluster-node-tuning-operator/tuned-tcns5" Apr 17 14:34:08.710035 
ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.710011 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/044e9f1a-a8ec-4b10-8647-92f9ec016842-etc-sysctl-d\") pod \"tuned-tcns5\" (UID: \"044e9f1a-a8ec-4b10-8647-92f9ec016842\") " pod="openshift-cluster-node-tuning-operator/tuned-tcns5" Apr 17 14:34:08.710035 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.710029 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/62d02888-cea1-4f15-b042-fb651835bf6a-agent-certs\") pod \"konnectivity-agent-lvtf4\" (UID: \"62d02888-cea1-4f15-b042-fb651835bf6a\") " pod="kube-system/konnectivity-agent-lvtf4" Apr 17 14:34:08.710388 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:34:08.710058 2582 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 14:34:08.710388 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.710073 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f219cfa3-6451-4834-b849-5d264acf8bac-host-cni-netd\") pod \"ovnkube-node-f5brp\" (UID: \"f219cfa3-6451-4834-b849-5d264acf8bac\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5brp" Apr 17 14:34:08.710388 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.710104 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7f84353d-e913-4a0e-94b9-1138b03b1814-etc-kubernetes\") pod \"multus-4zcj9\" (UID: \"7f84353d-e913-4a0e-94b9-1138b03b1814\") " pod="openshift-multus/multus-4zcj9" Apr 17 14:34:08.710388 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.710129 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/044e9f1a-a8ec-4b10-8647-92f9ec016842-host\") pod \"tuned-tcns5\" (UID: \"044e9f1a-a8ec-4b10-8647-92f9ec016842\") " pod="openshift-cluster-node-tuning-operator/tuned-tcns5" Apr 17 14:34:08.710388 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:34:08.710165 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fbcf40f6-2ec0-4fb3-85d8-30ecb284384d-metrics-certs podName:fbcf40f6-2ec0-4fb3-85d8-30ecb284384d nodeName:}" failed. No retries permitted until 2026-04-17 14:34:09.210124516 +0000 UTC m=+2.077510166 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fbcf40f6-2ec0-4fb3-85d8-30ecb284384d-metrics-certs") pod "network-metrics-daemon-4kdjq" (UID: "fbcf40f6-2ec0-4fb3-85d8-30ecb284384d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 14:34:08.710388 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.710180 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/044e9f1a-a8ec-4b10-8647-92f9ec016842-host\") pod \"tuned-tcns5\" (UID: \"044e9f1a-a8ec-4b10-8647-92f9ec016842\") " pod="openshift-cluster-node-tuning-operator/tuned-tcns5" Apr 17 14:34:08.710388 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.710200 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vql8w\" (UniqueName: \"kubernetes.io/projected/fbcf40f6-2ec0-4fb3-85d8-30ecb284384d-kube-api-access-vql8w\") pod \"network-metrics-daemon-4kdjq\" (UID: \"fbcf40f6-2ec0-4fb3-85d8-30ecb284384d\") " pod="openshift-multus/network-metrics-daemon-4kdjq" Apr 17 14:34:08.710388 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.710221 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7f84353d-e913-4a0e-94b9-1138b03b1814-etc-kubernetes\") pod \"multus-4zcj9\" (UID: 
\"7f84353d-e913-4a0e-94b9-1138b03b1814\") " pod="openshift-multus/multus-4zcj9" Apr 17 14:34:08.710388 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.710227 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/175f6a59-d17b-42f0-b454-ff9a315c3d7a-cnibin\") pod \"multus-additional-cni-plugins-5kzp2\" (UID: \"175f6a59-d17b-42f0-b454-ff9a315c3d7a\") " pod="openshift-multus/multus-additional-cni-plugins-5kzp2" Apr 17 14:34:08.710388 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.710253 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/62d02888-cea1-4f15-b042-fb651835bf6a-konnectivity-ca\") pod \"konnectivity-agent-lvtf4\" (UID: \"62d02888-cea1-4f15-b042-fb651835bf6a\") " pod="kube-system/konnectivity-agent-lvtf4" Apr 17 14:34:08.710388 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.710281 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5f9qw\" (UniqueName: \"kubernetes.io/projected/efde8bcb-629a-4cd7-9fe4-cea71e67b06e-kube-api-access-5f9qw\") pod \"network-check-target-jfnzx\" (UID: \"efde8bcb-629a-4cd7-9fe4-cea71e67b06e\") " pod="openshift-network-diagnostics/network-check-target-jfnzx" Apr 17 14:34:08.710388 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.710320 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/175f6a59-d17b-42f0-b454-ff9a315c3d7a-cnibin\") pod \"multus-additional-cni-plugins-5kzp2\" (UID: \"175f6a59-d17b-42f0-b454-ff9a315c3d7a\") " pod="openshift-multus/multus-additional-cni-plugins-5kzp2" Apr 17 14:34:08.710388 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.710325 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/f219cfa3-6451-4834-b849-5d264acf8bac-ovnkube-config\") pod \"ovnkube-node-f5brp\" (UID: \"f219cfa3-6451-4834-b849-5d264acf8bac\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5brp" Apr 17 14:34:08.710388 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.710359 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7f84353d-e913-4a0e-94b9-1138b03b1814-host-run-multus-certs\") pod \"multus-4zcj9\" (UID: \"7f84353d-e913-4a0e-94b9-1138b03b1814\") " pod="openshift-multus/multus-4zcj9" Apr 17 14:34:08.710388 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.710392 2582 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 17 14:34:08.711235 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.710395 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7f84353d-e913-4a0e-94b9-1138b03b1814-host-run-multus-certs\") pod \"multus-4zcj9\" (UID: \"7f84353d-e913-4a0e-94b9-1138b03b1814\") " pod="openshift-multus/multus-4zcj9" Apr 17 14:34:08.711235 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.710444 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f219cfa3-6451-4834-b849-5d264acf8bac-etc-openvswitch\") pod \"ovnkube-node-f5brp\" (UID: \"f219cfa3-6451-4834-b849-5d264acf8bac\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5brp" Apr 17 14:34:08.711235 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.710464 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7f84353d-e913-4a0e-94b9-1138b03b1814-multus-cni-dir\") pod 
\"multus-4zcj9\" (UID: \"7f84353d-e913-4a0e-94b9-1138b03b1814\") " pod="openshift-multus/multus-4zcj9" Apr 17 14:34:08.711235 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.710479 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7f84353d-e913-4a0e-94b9-1138b03b1814-cnibin\") pod \"multus-4zcj9\" (UID: \"7f84353d-e913-4a0e-94b9-1138b03b1814\") " pod="openshift-multus/multus-4zcj9" Apr 17 14:34:08.711235 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.710499 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7f84353d-e913-4a0e-94b9-1138b03b1814-host-var-lib-kubelet\") pod \"multus-4zcj9\" (UID: \"7f84353d-e913-4a0e-94b9-1138b03b1814\") " pod="openshift-multus/multus-4zcj9" Apr 17 14:34:08.711235 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.710526 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/044e9f1a-a8ec-4b10-8647-92f9ec016842-etc-systemd\") pod \"tuned-tcns5\" (UID: \"044e9f1a-a8ec-4b10-8647-92f9ec016842\") " pod="openshift-cluster-node-tuning-operator/tuned-tcns5" Apr 17 14:34:08.711235 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.710539 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7f84353d-e913-4a0e-94b9-1138b03b1814-multus-cni-dir\") pod \"multus-4zcj9\" (UID: \"7f84353d-e913-4a0e-94b9-1138b03b1814\") " pod="openshift-multus/multus-4zcj9" Apr 17 14:34:08.711235 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.710545 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7f84353d-e913-4a0e-94b9-1138b03b1814-cnibin\") pod \"multus-4zcj9\" (UID: \"7f84353d-e913-4a0e-94b9-1138b03b1814\") " pod="openshift-multus/multus-4zcj9" Apr 
17 14:34:08.711235 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.710551 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f219cfa3-6451-4834-b849-5d264acf8bac-host-slash\") pod \"ovnkube-node-f5brp\" (UID: \"f219cfa3-6451-4834-b849-5d264acf8bac\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5brp" Apr 17 14:34:08.711235 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.710576 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e46a81b9-6866-4354-99c3-22badfdac979-host-slash\") pod \"iptables-alerter-t74vl\" (UID: \"e46a81b9-6866-4354-99c3-22badfdac979\") " pod="openshift-network-operator/iptables-alerter-t74vl" Apr 17 14:34:08.711235 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.710552 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7f84353d-e913-4a0e-94b9-1138b03b1814-host-var-lib-kubelet\") pod \"multus-4zcj9\" (UID: \"7f84353d-e913-4a0e-94b9-1138b03b1814\") " pod="openshift-multus/multus-4zcj9" Apr 17 14:34:08.711235 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.710593 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/175f6a59-d17b-42f0-b454-ff9a315c3d7a-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-5kzp2\" (UID: \"175f6a59-d17b-42f0-b454-ff9a315c3d7a\") " pod="openshift-multus/multus-additional-cni-plugins-5kzp2" Apr 17 14:34:08.711235 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.710593 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/044e9f1a-a8ec-4b10-8647-92f9ec016842-etc-systemd\") pod \"tuned-tcns5\" (UID: \"044e9f1a-a8ec-4b10-8647-92f9ec016842\") " 
pod="openshift-cluster-node-tuning-operator/tuned-tcns5" Apr 17 14:34:08.711235 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.710627 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e46a81b9-6866-4354-99c3-22badfdac979-host-slash\") pod \"iptables-alerter-t74vl\" (UID: \"e46a81b9-6866-4354-99c3-22badfdac979\") " pod="openshift-network-operator/iptables-alerter-t74vl" Apr 17 14:34:08.711235 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.710625 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b7c909cf-d472-4d25-8c6a-3ae53e4931b5-kubelet-dir\") pod \"aws-ebs-csi-driver-node-nxxzr\" (UID: \"b7c909cf-d472-4d25-8c6a-3ae53e4931b5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nxxzr" Apr 17 14:34:08.711235 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.710669 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/b7c909cf-d472-4d25-8c6a-3ae53e4931b5-sys-fs\") pod \"aws-ebs-csi-driver-node-nxxzr\" (UID: \"b7c909cf-d472-4d25-8c6a-3ae53e4931b5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nxxzr" Apr 17 14:34:08.711235 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.710690 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f219cfa3-6451-4834-b849-5d264acf8bac-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-f5brp\" (UID: \"f219cfa3-6451-4834-b849-5d264acf8bac\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5brp" Apr 17 14:34:08.712163 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.710744 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8fjdf\" 
(UniqueName: \"kubernetes.io/projected/7f84353d-e913-4a0e-94b9-1138b03b1814-kube-api-access-8fjdf\") pod \"multus-4zcj9\" (UID: \"7f84353d-e913-4a0e-94b9-1138b03b1814\") " pod="openshift-multus/multus-4zcj9" Apr 17 14:34:08.712163 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.710762 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/044e9f1a-a8ec-4b10-8647-92f9ec016842-etc-kubernetes\") pod \"tuned-tcns5\" (UID: \"044e9f1a-a8ec-4b10-8647-92f9ec016842\") " pod="openshift-cluster-node-tuning-operator/tuned-tcns5" Apr 17 14:34:08.712163 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.710785 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/175f6a59-d17b-42f0-b454-ff9a315c3d7a-os-release\") pod \"multus-additional-cni-plugins-5kzp2\" (UID: \"175f6a59-d17b-42f0-b454-ff9a315c3d7a\") " pod="openshift-multus/multus-additional-cni-plugins-5kzp2" Apr 17 14:34:08.712163 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.710869 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/175f6a59-d17b-42f0-b454-ff9a315c3d7a-os-release\") pod \"multus-additional-cni-plugins-5kzp2\" (UID: \"175f6a59-d17b-42f0-b454-ff9a315c3d7a\") " pod="openshift-multus/multus-additional-cni-plugins-5kzp2" Apr 17 14:34:08.712163 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.710899 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/e46a81b9-6866-4354-99c3-22badfdac979-iptables-alerter-script\") pod \"iptables-alerter-t74vl\" (UID: \"e46a81b9-6866-4354-99c3-22badfdac979\") " pod="openshift-network-operator/iptables-alerter-t74vl" Apr 17 14:34:08.712163 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.710915 2582 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/044e9f1a-a8ec-4b10-8647-92f9ec016842-etc-kubernetes\") pod \"tuned-tcns5\" (UID: \"044e9f1a-a8ec-4b10-8647-92f9ec016842\") " pod="openshift-cluster-node-tuning-operator/tuned-tcns5" Apr 17 14:34:08.712163 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.710943 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-727wf\" (UniqueName: \"kubernetes.io/projected/175f6a59-d17b-42f0-b454-ff9a315c3d7a-kube-api-access-727wf\") pod \"multus-additional-cni-plugins-5kzp2\" (UID: \"175f6a59-d17b-42f0-b454-ff9a315c3d7a\") " pod="openshift-multus/multus-additional-cni-plugins-5kzp2" Apr 17 14:34:08.712163 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.710898 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/62d02888-cea1-4f15-b042-fb651835bf6a-konnectivity-ca\") pod \"konnectivity-agent-lvtf4\" (UID: \"62d02888-cea1-4f15-b042-fb651835bf6a\") " pod="kube-system/konnectivity-agent-lvtf4" Apr 17 14:34:08.712163 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.710969 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzq8x\" (UniqueName: \"kubernetes.io/projected/b7c909cf-d472-4d25-8c6a-3ae53e4931b5-kube-api-access-lzq8x\") pod \"aws-ebs-csi-driver-node-nxxzr\" (UID: \"b7c909cf-d472-4d25-8c6a-3ae53e4931b5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nxxzr" Apr 17 14:34:08.712163 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.710999 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f219cfa3-6451-4834-b849-5d264acf8bac-host-run-ovn-kubernetes\") pod \"ovnkube-node-f5brp\" (UID: \"f219cfa3-6451-4834-b849-5d264acf8bac\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-f5brp" Apr 17 14:34:08.712163 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.711028 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pqmbs\" (UniqueName: \"kubernetes.io/projected/542f49ba-8bb4-4178-9c98-a94bc1f60de1-kube-api-access-pqmbs\") pod \"node-resolver-m5qlh\" (UID: \"542f49ba-8bb4-4178-9c98-a94bc1f60de1\") " pod="openshift-dns/node-resolver-m5qlh" Apr 17 14:34:08.712163 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.711051 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7f84353d-e913-4a0e-94b9-1138b03b1814-host-var-lib-cni-bin\") pod \"multus-4zcj9\" (UID: \"7f84353d-e913-4a0e-94b9-1138b03b1814\") " pod="openshift-multus/multus-4zcj9" Apr 17 14:34:08.712163 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.711072 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zbd96\" (UniqueName: \"kubernetes.io/projected/e46a81b9-6866-4354-99c3-22badfdac979-kube-api-access-zbd96\") pod \"iptables-alerter-t74vl\" (UID: \"e46a81b9-6866-4354-99c3-22badfdac979\") " pod="openshift-network-operator/iptables-alerter-t74vl" Apr 17 14:34:08.712163 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.711091 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9vtvf\" (UniqueName: \"kubernetes.io/projected/044e9f1a-a8ec-4b10-8647-92f9ec016842-kube-api-access-9vtvf\") pod \"tuned-tcns5\" (UID: \"044e9f1a-a8ec-4b10-8647-92f9ec016842\") " pod="openshift-cluster-node-tuning-operator/tuned-tcns5" Apr 17 14:34:08.712163 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.711119 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/175f6a59-d17b-42f0-b454-ff9a315c3d7a-system-cni-dir\") pod 
\"multus-additional-cni-plugins-5kzp2\" (UID: \"175f6a59-d17b-42f0-b454-ff9a315c3d7a\") " pod="openshift-multus/multus-additional-cni-plugins-5kzp2" Apr 17 14:34:08.712163 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.711123 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7f84353d-e913-4a0e-94b9-1138b03b1814-host-var-lib-cni-bin\") pod \"multus-4zcj9\" (UID: \"7f84353d-e913-4a0e-94b9-1138b03b1814\") " pod="openshift-multus/multus-4zcj9" Apr 17 14:34:08.712163 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.711120 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/175f6a59-d17b-42f0-b454-ff9a315c3d7a-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-5kzp2\" (UID: \"175f6a59-d17b-42f0-b454-ff9a315c3d7a\") " pod="openshift-multus/multus-additional-cni-plugins-5kzp2" Apr 17 14:34:08.713097 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.711174 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/175f6a59-d17b-42f0-b454-ff9a315c3d7a-system-cni-dir\") pod \"multus-additional-cni-plugins-5kzp2\" (UID: \"175f6a59-d17b-42f0-b454-ff9a315c3d7a\") " pod="openshift-multus/multus-additional-cni-plugins-5kzp2" Apr 17 14:34:08.713097 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.711229 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/175f6a59-d17b-42f0-b454-ff9a315c3d7a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5kzp2\" (UID: \"175f6a59-d17b-42f0-b454-ff9a315c3d7a\") " pod="openshift-multus/multus-additional-cni-plugins-5kzp2" Apr 17 14:34:08.713097 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.711255 2582 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/b7c909cf-d472-4d25-8c6a-3ae53e4931b5-etc-selinux\") pod \"aws-ebs-csi-driver-node-nxxzr\" (UID: \"b7c909cf-d472-4d25-8c6a-3ae53e4931b5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nxxzr" Apr 17 14:34:08.713097 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.711278 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f219cfa3-6451-4834-b849-5d264acf8bac-host-run-netns\") pod \"ovnkube-node-f5brp\" (UID: \"f219cfa3-6451-4834-b849-5d264acf8bac\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5brp" Apr 17 14:34:08.713097 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.711303 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/044e9f1a-a8ec-4b10-8647-92f9ec016842-run\") pod \"tuned-tcns5\" (UID: \"044e9f1a-a8ec-4b10-8647-92f9ec016842\") " pod="openshift-cluster-node-tuning-operator/tuned-tcns5" Apr 17 14:34:08.713097 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.711324 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/044e9f1a-a8ec-4b10-8647-92f9ec016842-var-lib-kubelet\") pod \"tuned-tcns5\" (UID: \"044e9f1a-a8ec-4b10-8647-92f9ec016842\") " pod="openshift-cluster-node-tuning-operator/tuned-tcns5" Apr 17 14:34:08.713097 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.711348 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/044e9f1a-a8ec-4b10-8647-92f9ec016842-etc-tuned\") pod \"tuned-tcns5\" (UID: \"044e9f1a-a8ec-4b10-8647-92f9ec016842\") " pod="openshift-cluster-node-tuning-operator/tuned-tcns5" Apr 17 14:34:08.713097 ip-10-0-143-171 kubenswrapper[2582]: I0417 
14:34:08.711355 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/175f6a59-d17b-42f0-b454-ff9a315c3d7a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5kzp2\" (UID: \"175f6a59-d17b-42f0-b454-ff9a315c3d7a\") " pod="openshift-multus/multus-additional-cni-plugins-5kzp2" Apr 17 14:34:08.713097 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.711372 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/82ff5ce9-528b-4d19-9a09-ac7e64ef9d46-host\") pod \"node-ca-rhww7\" (UID: \"82ff5ce9-528b-4d19-9a09-ac7e64ef9d46\") " pod="openshift-image-registry/node-ca-rhww7" Apr 17 14:34:08.713097 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.711396 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/175f6a59-d17b-42f0-b454-ff9a315c3d7a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5kzp2\" (UID: \"175f6a59-d17b-42f0-b454-ff9a315c3d7a\") " pod="openshift-multus/multus-additional-cni-plugins-5kzp2" Apr 17 14:34:08.713097 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.711401 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/044e9f1a-a8ec-4b10-8647-92f9ec016842-var-lib-kubelet\") pod \"tuned-tcns5\" (UID: \"044e9f1a-a8ec-4b10-8647-92f9ec016842\") " pod="openshift-cluster-node-tuning-operator/tuned-tcns5" Apr 17 14:34:08.713097 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.711374 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/044e9f1a-a8ec-4b10-8647-92f9ec016842-run\") pod \"tuned-tcns5\" (UID: \"044e9f1a-a8ec-4b10-8647-92f9ec016842\") " pod="openshift-cluster-node-tuning-operator/tuned-tcns5" Apr 17 14:34:08.713097 ip-10-0-143-171 
kubenswrapper[2582]: I0417 14:34:08.711422 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/b7c909cf-d472-4d25-8c6a-3ae53e4931b5-device-dir\") pod \"aws-ebs-csi-driver-node-nxxzr\" (UID: \"b7c909cf-d472-4d25-8c6a-3ae53e4931b5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nxxzr" Apr 17 14:34:08.713097 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.711459 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f219cfa3-6451-4834-b849-5d264acf8bac-systemd-units\") pod \"ovnkube-node-f5brp\" (UID: \"f219cfa3-6451-4834-b849-5d264acf8bac\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5brp" Apr 17 14:34:08.713097 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.711473 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/82ff5ce9-528b-4d19-9a09-ac7e64ef9d46-host\") pod \"node-ca-rhww7\" (UID: \"82ff5ce9-528b-4d19-9a09-ac7e64ef9d46\") " pod="openshift-image-registry/node-ca-rhww7" Apr 17 14:34:08.713097 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.711491 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7f84353d-e913-4a0e-94b9-1138b03b1814-system-cni-dir\") pod \"multus-4zcj9\" (UID: \"7f84353d-e913-4a0e-94b9-1138b03b1814\") " pod="openshift-multus/multus-4zcj9" Apr 17 14:34:08.713097 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.711519 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7f84353d-e913-4a0e-94b9-1138b03b1814-host-run-netns\") pod \"multus-4zcj9\" (UID: \"7f84353d-e913-4a0e-94b9-1138b03b1814\") " pod="openshift-multus/multus-4zcj9" Apr 17 14:34:08.713600 ip-10-0-143-171 
kubenswrapper[2582]: I0417 14:34:08.711544 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7f84353d-e913-4a0e-94b9-1138b03b1814-host-var-lib-cni-multus\") pod \"multus-4zcj9\" (UID: \"7f84353d-e913-4a0e-94b9-1138b03b1814\") " pod="openshift-multus/multus-4zcj9"
Apr 17 14:34:08.713600 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.711570 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/044e9f1a-a8ec-4b10-8647-92f9ec016842-lib-modules\") pod \"tuned-tcns5\" (UID: \"044e9f1a-a8ec-4b10-8647-92f9ec016842\") " pod="openshift-cluster-node-tuning-operator/tuned-tcns5"
Apr 17 14:34:08.713600 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.711596 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f219cfa3-6451-4834-b849-5d264acf8bac-run-ovn\") pod \"ovnkube-node-f5brp\" (UID: \"f219cfa3-6451-4834-b849-5d264acf8bac\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5brp"
Apr 17 14:34:08.713600 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.711622 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f219cfa3-6451-4834-b849-5d264acf8bac-env-overrides\") pod \"ovnkube-node-f5brp\" (UID: \"f219cfa3-6451-4834-b849-5d264acf8bac\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5brp"
Apr 17 14:34:08.713600 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.711632 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7f84353d-e913-4a0e-94b9-1138b03b1814-system-cni-dir\") pod \"multus-4zcj9\" (UID: \"7f84353d-e913-4a0e-94b9-1138b03b1814\") " pod="openshift-multus/multus-4zcj9"
Apr 17 14:34:08.713600 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.711649 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7f84353d-e913-4a0e-94b9-1138b03b1814-host-run-k8s-cni-cncf-io\") pod \"multus-4zcj9\" (UID: \"7f84353d-e913-4a0e-94b9-1138b03b1814\") " pod="openshift-multus/multus-4zcj9"
Apr 17 14:34:08.713600 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.711678 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7f84353d-e913-4a0e-94b9-1138b03b1814-hostroot\") pod \"multus-4zcj9\" (UID: \"7f84353d-e913-4a0e-94b9-1138b03b1814\") " pod="openshift-multus/multus-4zcj9"
Apr 17 14:34:08.713600 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.711687 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7f84353d-e913-4a0e-94b9-1138b03b1814-host-var-lib-cni-multus\") pod \"multus-4zcj9\" (UID: \"7f84353d-e913-4a0e-94b9-1138b03b1814\") " pod="openshift-multus/multus-4zcj9"
Apr 17 14:34:08.713600 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.711703 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7f84353d-e913-4a0e-94b9-1138b03b1814-multus-conf-dir\") pod \"multus-4zcj9\" (UID: \"7f84353d-e913-4a0e-94b9-1138b03b1814\") " pod="openshift-multus/multus-4zcj9"
Apr 17 14:34:08.713600 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.711598 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7f84353d-e913-4a0e-94b9-1138b03b1814-host-run-netns\") pod \"multus-4zcj9\" (UID: \"7f84353d-e913-4a0e-94b9-1138b03b1814\") " pod="openshift-multus/multus-4zcj9"
Apr 17 14:34:08.713600 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.711763 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7f84353d-e913-4a0e-94b9-1138b03b1814-hostroot\") pod \"multus-4zcj9\" (UID: \"7f84353d-e913-4a0e-94b9-1138b03b1814\") " pod="openshift-multus/multus-4zcj9"
Apr 17 14:34:08.713600 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.711750 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7f84353d-e913-4a0e-94b9-1138b03b1814-host-run-k8s-cni-cncf-io\") pod \"multus-4zcj9\" (UID: \"7f84353d-e913-4a0e-94b9-1138b03b1814\") " pod="openshift-multus/multus-4zcj9"
Apr 17 14:34:08.713600 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.711770 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/044e9f1a-a8ec-4b10-8647-92f9ec016842-lib-modules\") pod \"tuned-tcns5\" (UID: \"044e9f1a-a8ec-4b10-8647-92f9ec016842\") " pod="openshift-cluster-node-tuning-operator/tuned-tcns5"
Apr 17 14:34:08.713600 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.711794 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/044e9f1a-a8ec-4b10-8647-92f9ec016842-etc-modprobe-d\") pod \"tuned-tcns5\" (UID: \"044e9f1a-a8ec-4b10-8647-92f9ec016842\") " pod="openshift-cluster-node-tuning-operator/tuned-tcns5"
Apr 17 14:34:08.713600 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.711822 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7f84353d-e913-4a0e-94b9-1138b03b1814-multus-conf-dir\") pod \"multus-4zcj9\" (UID: \"7f84353d-e913-4a0e-94b9-1138b03b1814\") " pod="openshift-multus/multus-4zcj9"
Apr 17 14:34:08.713600 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.711840 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/82ff5ce9-528b-4d19-9a09-ac7e64ef9d46-serviceca\") pod \"node-ca-rhww7\" (UID: \"82ff5ce9-528b-4d19-9a09-ac7e64ef9d46\") " pod="openshift-image-registry/node-ca-rhww7"
Apr 17 14:34:08.713600 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.711855 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/175f6a59-d17b-42f0-b454-ff9a315c3d7a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5kzp2\" (UID: \"175f6a59-d17b-42f0-b454-ff9a315c3d7a\") " pod="openshift-multus/multus-additional-cni-plugins-5kzp2"
Apr 17 14:34:08.713600 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.711869 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b7c909cf-d472-4d25-8c6a-3ae53e4931b5-registration-dir\") pod \"aws-ebs-csi-driver-node-nxxzr\" (UID: \"b7c909cf-d472-4d25-8c6a-3ae53e4931b5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nxxzr"
Apr 17 14:34:08.714124 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.711880 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/044e9f1a-a8ec-4b10-8647-92f9ec016842-etc-modprobe-d\") pod \"tuned-tcns5\" (UID: \"044e9f1a-a8ec-4b10-8647-92f9ec016842\") " pod="openshift-cluster-node-tuning-operator/tuned-tcns5"
Apr 17 14:34:08.714124 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.711894 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f219cfa3-6451-4834-b849-5d264acf8bac-host-cni-bin\") pod \"ovnkube-node-f5brp\" (UID: \"f219cfa3-6451-4834-b849-5d264acf8bac\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5brp"
Apr 17 14:34:08.714124 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.711997 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcccx\" (UniqueName: \"kubernetes.io/projected/f219cfa3-6451-4834-b849-5d264acf8bac-kube-api-access-jcccx\") pod \"ovnkube-node-f5brp\" (UID: \"f219cfa3-6451-4834-b849-5d264acf8bac\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5brp"
Apr 17 14:34:08.714124 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.712167 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/044e9f1a-a8ec-4b10-8647-92f9ec016842-sys\") pod \"tuned-tcns5\" (UID: \"044e9f1a-a8ec-4b10-8647-92f9ec016842\") " pod="openshift-cluster-node-tuning-operator/tuned-tcns5"
Apr 17 14:34:08.714124 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.712194 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/044e9f1a-a8ec-4b10-8647-92f9ec016842-tmp\") pod \"tuned-tcns5\" (UID: \"044e9f1a-a8ec-4b10-8647-92f9ec016842\") " pod="openshift-cluster-node-tuning-operator/tuned-tcns5"
Apr 17 14:34:08.714124 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.712218 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kt2m9\" (UniqueName: \"kubernetes.io/projected/82ff5ce9-528b-4d19-9a09-ac7e64ef9d46-kube-api-access-kt2m9\") pod \"node-ca-rhww7\" (UID: \"82ff5ce9-528b-4d19-9a09-ac7e64ef9d46\") " pod="openshift-image-registry/node-ca-rhww7"
Apr 17 14:34:08.714124 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.712205 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/82ff5ce9-528b-4d19-9a09-ac7e64ef9d46-serviceca\") pod \"node-ca-rhww7\" (UID: \"82ff5ce9-528b-4d19-9a09-ac7e64ef9d46\") " pod="openshift-image-registry/node-ca-rhww7"
Apr 17 14:34:08.714124 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.712248 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/542f49ba-8bb4-4178-9c98-a94bc1f60de1-hosts-file\") pod \"node-resolver-m5qlh\" (UID: \"542f49ba-8bb4-4178-9c98-a94bc1f60de1\") " pod="openshift-dns/node-resolver-m5qlh"
Apr 17 14:34:08.714124 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.712272 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7f84353d-e913-4a0e-94b9-1138b03b1814-os-release\") pod \"multus-4zcj9\" (UID: \"7f84353d-e913-4a0e-94b9-1138b03b1814\") " pod="openshift-multus/multus-4zcj9"
Apr 17 14:34:08.714124 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.712250 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/044e9f1a-a8ec-4b10-8647-92f9ec016842-sys\") pod \"tuned-tcns5\" (UID: \"044e9f1a-a8ec-4b10-8647-92f9ec016842\") " pod="openshift-cluster-node-tuning-operator/tuned-tcns5"
Apr 17 14:34:08.714124 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.712282 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/542f49ba-8bb4-4178-9c98-a94bc1f60de1-hosts-file\") pod \"node-resolver-m5qlh\" (UID: \"542f49ba-8bb4-4178-9c98-a94bc1f60de1\") " pod="openshift-dns/node-resolver-m5qlh"
Apr 17 14:34:08.714124 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.712301 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7f84353d-e913-4a0e-94b9-1138b03b1814-cni-binary-copy\") pod \"multus-4zcj9\" (UID: \"7f84353d-e913-4a0e-94b9-1138b03b1814\") " pod="openshift-multus/multus-4zcj9"
Apr 17 14:34:08.714124 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.712326 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/175f6a59-d17b-42f0-b454-ff9a315c3d7a-cni-binary-copy\") pod \"multus-additional-cni-plugins-5kzp2\" (UID: \"175f6a59-d17b-42f0-b454-ff9a315c3d7a\") " pod="openshift-multus/multus-additional-cni-plugins-5kzp2"
Apr 17 14:34:08.714124 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.712332 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7f84353d-e913-4a0e-94b9-1138b03b1814-os-release\") pod \"multus-4zcj9\" (UID: \"7f84353d-e913-4a0e-94b9-1138b03b1814\") " pod="openshift-multus/multus-4zcj9"
Apr 17 14:34:08.714124 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.712355 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f219cfa3-6451-4834-b849-5d264acf8bac-ovnkube-script-lib\") pod \"ovnkube-node-f5brp\" (UID: \"f219cfa3-6451-4834-b849-5d264acf8bac\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5brp"
Apr 17 14:34:08.714124 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.712398 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/542f49ba-8bb4-4178-9c98-a94bc1f60de1-tmp-dir\") pod \"node-resolver-m5qlh\" (UID: \"542f49ba-8bb4-4178-9c98-a94bc1f60de1\") " pod="openshift-dns/node-resolver-m5qlh"
Apr 17 14:34:08.714124 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.712444 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b7c909cf-d472-4d25-8c6a-3ae53e4931b5-socket-dir\") pod \"aws-ebs-csi-driver-node-nxxzr\" (UID: \"b7c909cf-d472-4d25-8c6a-3ae53e4931b5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nxxzr"
Apr 17 14:34:08.714124 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.712469 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f219cfa3-6451-4834-b849-5d264acf8bac-ovn-node-metrics-cert\") pod \"ovnkube-node-f5brp\" (UID: \"f219cfa3-6451-4834-b849-5d264acf8bac\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5brp"
Apr 17 14:34:08.714623 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.712498 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7f84353d-e913-4a0e-94b9-1138b03b1814-multus-daemon-config\") pod \"multus-4zcj9\" (UID: \"7f84353d-e913-4a0e-94b9-1138b03b1814\") " pod="openshift-multus/multus-4zcj9"
Apr 17 14:34:08.714623 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.712540 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/044e9f1a-a8ec-4b10-8647-92f9ec016842-etc-sysctl-conf\") pod \"tuned-tcns5\" (UID: \"044e9f1a-a8ec-4b10-8647-92f9ec016842\") " pod="openshift-cluster-node-tuning-operator/tuned-tcns5"
Apr 17 14:34:08.714623 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.712595 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f219cfa3-6451-4834-b849-5d264acf8bac-host-kubelet\") pod \"ovnkube-node-f5brp\" (UID: \"f219cfa3-6451-4834-b849-5d264acf8bac\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5brp"
Apr 17 14:34:08.714623 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.712619 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f219cfa3-6451-4834-b849-5d264acf8bac-var-lib-openvswitch\") pod \"ovnkube-node-f5brp\" (UID: \"f219cfa3-6451-4834-b849-5d264acf8bac\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5brp"
Apr 17 14:34:08.714623 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.712645 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f219cfa3-6451-4834-b849-5d264acf8bac-log-socket\") pod \"ovnkube-node-f5brp\" (UID: \"f219cfa3-6451-4834-b849-5d264acf8bac\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5brp"
Apr 17 14:34:08.714623 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.712890 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7f84353d-e913-4a0e-94b9-1138b03b1814-cni-binary-copy\") pod \"multus-4zcj9\" (UID: \"7f84353d-e913-4a0e-94b9-1138b03b1814\") " pod="openshift-multus/multus-4zcj9"
Apr 17 14:34:08.714623 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.712925 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/542f49ba-8bb4-4178-9c98-a94bc1f60de1-tmp-dir\") pod \"node-resolver-m5qlh\" (UID: \"542f49ba-8bb4-4178-9c98-a94bc1f60de1\") " pod="openshift-dns/node-resolver-m5qlh"
Apr 17 14:34:08.714623 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.712949 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/175f6a59-d17b-42f0-b454-ff9a315c3d7a-cni-binary-copy\") pod \"multus-additional-cni-plugins-5kzp2\" (UID: \"175f6a59-d17b-42f0-b454-ff9a315c3d7a\") " pod="openshift-multus/multus-additional-cni-plugins-5kzp2"
Apr 17 14:34:08.714623 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.713101 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/044e9f1a-a8ec-4b10-8647-92f9ec016842-etc-sysctl-conf\") pod \"tuned-tcns5\" (UID: \"044e9f1a-a8ec-4b10-8647-92f9ec016842\") " pod="openshift-cluster-node-tuning-operator/tuned-tcns5"
Apr 17 14:34:08.714623 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.713284 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7f84353d-e913-4a0e-94b9-1138b03b1814-multus-daemon-config\") pod \"multus-4zcj9\" (UID: \"7f84353d-e913-4a0e-94b9-1138b03b1814\") " pod="openshift-multus/multus-4zcj9"
Apr 17 14:34:08.714623 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.713941 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/044e9f1a-a8ec-4b10-8647-92f9ec016842-etc-tuned\") pod \"tuned-tcns5\" (UID: \"044e9f1a-a8ec-4b10-8647-92f9ec016842\") " pod="openshift-cluster-node-tuning-operator/tuned-tcns5"
Apr 17 14:34:08.714623 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.714251 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/62d02888-cea1-4f15-b042-fb651835bf6a-agent-certs\") pod \"konnectivity-agent-lvtf4\" (UID: \"62d02888-cea1-4f15-b042-fb651835bf6a\") " pod="kube-system/konnectivity-agent-lvtf4"
Apr 17 14:34:08.714623 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.714392 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/044e9f1a-a8ec-4b10-8647-92f9ec016842-tmp\") pod \"tuned-tcns5\" (UID: \"044e9f1a-a8ec-4b10-8647-92f9ec016842\") " pod="openshift-cluster-node-tuning-operator/tuned-tcns5"
Apr 17 14:34:08.718581 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.718499 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vql8w\" (UniqueName: \"kubernetes.io/projected/fbcf40f6-2ec0-4fb3-85d8-30ecb284384d-kube-api-access-vql8w\") pod \"network-metrics-daemon-4kdjq\" (UID: \"fbcf40f6-2ec0-4fb3-85d8-30ecb284384d\") " pod="openshift-multus/network-metrics-daemon-4kdjq"
Apr 17 14:34:08.720175 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.720148 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-727wf\" (UniqueName: \"kubernetes.io/projected/175f6a59-d17b-42f0-b454-ff9a315c3d7a-kube-api-access-727wf\") pod \"multus-additional-cni-plugins-5kzp2\" (UID: \"175f6a59-d17b-42f0-b454-ff9a315c3d7a\") " pod="openshift-multus/multus-additional-cni-plugins-5kzp2"
Apr 17 14:34:08.720447 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.720426 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbd96\" (UniqueName: \"kubernetes.io/projected/e46a81b9-6866-4354-99c3-22badfdac979-kube-api-access-zbd96\") pod \"iptables-alerter-t74vl\" (UID: \"e46a81b9-6866-4354-99c3-22badfdac979\") " pod="openshift-network-operator/iptables-alerter-t74vl"
Apr 17 14:34:08.720956 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.720930 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqmbs\" (UniqueName: \"kubernetes.io/projected/542f49ba-8bb4-4178-9c98-a94bc1f60de1-kube-api-access-pqmbs\") pod \"node-resolver-m5qlh\" (UID: \"542f49ba-8bb4-4178-9c98-a94bc1f60de1\") " pod="openshift-dns/node-resolver-m5qlh"
Apr 17 14:34:08.721019 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.720961 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kt2m9\" (UniqueName: \"kubernetes.io/projected/82ff5ce9-528b-4d19-9a09-ac7e64ef9d46-kube-api-access-kt2m9\") pod \"node-ca-rhww7\" (UID: \"82ff5ce9-528b-4d19-9a09-ac7e64ef9d46\") " pod="openshift-image-registry/node-ca-rhww7"
Apr 17 14:34:08.721062 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.721022 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vtvf\" (UniqueName: \"kubernetes.io/projected/044e9f1a-a8ec-4b10-8647-92f9ec016842-kube-api-access-9vtvf\") pod \"tuned-tcns5\" (UID: \"044e9f1a-a8ec-4b10-8647-92f9ec016842\") " pod="openshift-cluster-node-tuning-operator/tuned-tcns5"
Apr 17 14:34:08.721391 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.721373 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fjdf\" (UniqueName: \"kubernetes.io/projected/7f84353d-e913-4a0e-94b9-1138b03b1814-kube-api-access-8fjdf\") pod \"multus-4zcj9\" (UID: \"7f84353d-e913-4a0e-94b9-1138b03b1814\") " pod="openshift-multus/multus-4zcj9"
Apr 17 14:34:08.813101 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.813031 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5f9qw\" (UniqueName: \"kubernetes.io/projected/efde8bcb-629a-4cd7-9fe4-cea71e67b06e-kube-api-access-5f9qw\") pod \"network-check-target-jfnzx\" (UID: \"efde8bcb-629a-4cd7-9fe4-cea71e67b06e\") " pod="openshift-network-diagnostics/network-check-target-jfnzx"
Apr 17 14:34:08.813101 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.813067 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f219cfa3-6451-4834-b849-5d264acf8bac-ovnkube-config\") pod \"ovnkube-node-f5brp\" (UID: \"f219cfa3-6451-4834-b849-5d264acf8bac\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5brp"
Apr 17 14:34:08.813101 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.813083 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f219cfa3-6451-4834-b849-5d264acf8bac-etc-openvswitch\") pod \"ovnkube-node-f5brp\" (UID: \"f219cfa3-6451-4834-b849-5d264acf8bac\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5brp"
Apr 17 14:34:08.813101 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.813100 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f219cfa3-6451-4834-b849-5d264acf8bac-host-slash\") pod \"ovnkube-node-f5brp\" (UID: \"f219cfa3-6451-4834-b849-5d264acf8bac\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5brp"
Apr 17 14:34:08.813455 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.813118 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b7c909cf-d472-4d25-8c6a-3ae53e4931b5-kubelet-dir\") pod \"aws-ebs-csi-driver-node-nxxzr\" (UID: \"b7c909cf-d472-4d25-8c6a-3ae53e4931b5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nxxzr"
Apr 17 14:34:08.813455 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.813142 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/b7c909cf-d472-4d25-8c6a-3ae53e4931b5-sys-fs\") pod \"aws-ebs-csi-driver-node-nxxzr\" (UID: \"b7c909cf-d472-4d25-8c6a-3ae53e4931b5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nxxzr"
Apr 17 14:34:08.813455 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.813168 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f219cfa3-6451-4834-b849-5d264acf8bac-etc-openvswitch\") pod \"ovnkube-node-f5brp\" (UID: \"f219cfa3-6451-4834-b849-5d264acf8bac\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5brp"
Apr 17 14:34:08.813455 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.813168 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f219cfa3-6451-4834-b849-5d264acf8bac-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-f5brp\" (UID: \"f219cfa3-6451-4834-b849-5d264acf8bac\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5brp"
Apr 17 14:34:08.813455 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.813208 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b7c909cf-d472-4d25-8c6a-3ae53e4931b5-kubelet-dir\") pod \"aws-ebs-csi-driver-node-nxxzr\" (UID: \"b7c909cf-d472-4d25-8c6a-3ae53e4931b5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nxxzr"
Apr 17 14:34:08.813455 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.813227 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/b7c909cf-d472-4d25-8c6a-3ae53e4931b5-sys-fs\") pod \"aws-ebs-csi-driver-node-nxxzr\" (UID: \"b7c909cf-d472-4d25-8c6a-3ae53e4931b5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nxxzr"
Apr 17 14:34:08.813455 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.813175 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f219cfa3-6451-4834-b849-5d264acf8bac-host-slash\") pod \"ovnkube-node-f5brp\" (UID: \"f219cfa3-6451-4834-b849-5d264acf8bac\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5brp"
Apr 17 14:34:08.813455 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.813240 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f219cfa3-6451-4834-b849-5d264acf8bac-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-f5brp\" (UID: \"f219cfa3-6451-4834-b849-5d264acf8bac\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5brp"
Apr 17 14:34:08.813455 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.813240 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lzq8x\" (UniqueName: \"kubernetes.io/projected/b7c909cf-d472-4d25-8c6a-3ae53e4931b5-kube-api-access-lzq8x\") pod \"aws-ebs-csi-driver-node-nxxzr\" (UID: \"b7c909cf-d472-4d25-8c6a-3ae53e4931b5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nxxzr"
Apr 17 14:34:08.813455 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.813276 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f219cfa3-6451-4834-b849-5d264acf8bac-host-run-ovn-kubernetes\") pod \"ovnkube-node-f5brp\" (UID: \"f219cfa3-6451-4834-b849-5d264acf8bac\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5brp"
Apr 17 14:34:08.813455 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.813311 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/b7c909cf-d472-4d25-8c6a-3ae53e4931b5-etc-selinux\") pod \"aws-ebs-csi-driver-node-nxxzr\" (UID: \"b7c909cf-d472-4d25-8c6a-3ae53e4931b5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nxxzr"
Apr 17 14:34:08.813455 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.813331 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f219cfa3-6451-4834-b849-5d264acf8bac-host-run-netns\") pod \"ovnkube-node-f5brp\" (UID: \"f219cfa3-6451-4834-b849-5d264acf8bac\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5brp"
Apr 17 14:34:08.813455 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.813358 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/b7c909cf-d472-4d25-8c6a-3ae53e4931b5-device-dir\") pod \"aws-ebs-csi-driver-node-nxxzr\" (UID: \"b7c909cf-d472-4d25-8c6a-3ae53e4931b5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nxxzr"
Apr 17 14:34:08.813455 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.813361 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f219cfa3-6451-4834-b849-5d264acf8bac-host-run-ovn-kubernetes\") pod \"ovnkube-node-f5brp\" (UID: \"f219cfa3-6451-4834-b849-5d264acf8bac\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5brp"
Apr 17 14:34:08.813455 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.813378 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f219cfa3-6451-4834-b849-5d264acf8bac-systemd-units\") pod \"ovnkube-node-f5brp\" (UID: \"f219cfa3-6451-4834-b849-5d264acf8bac\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5brp"
Apr 17 14:34:08.813455 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.813404 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f219cfa3-6451-4834-b849-5d264acf8bac-systemd-units\") pod \"ovnkube-node-f5brp\" (UID: \"f219cfa3-6451-4834-b849-5d264acf8bac\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5brp"
Apr 17 14:34:08.813455 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.813407 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f219cfa3-6451-4834-b849-5d264acf8bac-host-run-netns\") pod \"ovnkube-node-f5brp\" (UID: \"f219cfa3-6451-4834-b849-5d264acf8bac\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5brp"
Apr 17 14:34:08.814253 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.813431 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/b7c909cf-d472-4d25-8c6a-3ae53e4931b5-etc-selinux\") pod \"aws-ebs-csi-driver-node-nxxzr\" (UID: \"b7c909cf-d472-4d25-8c6a-3ae53e4931b5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nxxzr"
Apr 17 14:34:08.814253 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.813435 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f219cfa3-6451-4834-b849-5d264acf8bac-run-ovn\") pod \"ovnkube-node-f5brp\" (UID: \"f219cfa3-6451-4834-b849-5d264acf8bac\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5brp"
Apr 17 14:34:08.814253 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.813463 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f219cfa3-6451-4834-b849-5d264acf8bac-run-ovn\") pod \"ovnkube-node-f5brp\" (UID: \"f219cfa3-6451-4834-b849-5d264acf8bac\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5brp"
Apr 17 14:34:08.814253 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.813451 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/b7c909cf-d472-4d25-8c6a-3ae53e4931b5-device-dir\") pod \"aws-ebs-csi-driver-node-nxxzr\" (UID: \"b7c909cf-d472-4d25-8c6a-3ae53e4931b5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nxxzr"
Apr 17 14:34:08.814253 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.813475 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f219cfa3-6451-4834-b849-5d264acf8bac-env-overrides\") pod \"ovnkube-node-f5brp\" (UID: \"f219cfa3-6451-4834-b849-5d264acf8bac\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5brp"
Apr 17 14:34:08.814253 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.813497 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b7c909cf-d472-4d25-8c6a-3ae53e4931b5-registration-dir\") pod \"aws-ebs-csi-driver-node-nxxzr\" (UID: \"b7c909cf-d472-4d25-8c6a-3ae53e4931b5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nxxzr"
Apr 17 14:34:08.814253 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.813517 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f219cfa3-6451-4834-b849-5d264acf8bac-host-cni-bin\") pod \"ovnkube-node-f5brp\" (UID: \"f219cfa3-6451-4834-b849-5d264acf8bac\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5brp"
Apr 17 14:34:08.814253 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.813540 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jcccx\" (UniqueName: \"kubernetes.io/projected/f219cfa3-6451-4834-b849-5d264acf8bac-kube-api-access-jcccx\") pod \"ovnkube-node-f5brp\" (UID: \"f219cfa3-6451-4834-b849-5d264acf8bac\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5brp"
Apr 17 14:34:08.814253 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.813562 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f219cfa3-6451-4834-b849-5d264acf8bac-ovnkube-script-lib\") pod \"ovnkube-node-f5brp\" (UID: \"f219cfa3-6451-4834-b849-5d264acf8bac\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5brp"
Apr 17 14:34:08.814253 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.813582 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b7c909cf-d472-4d25-8c6a-3ae53e4931b5-registration-dir\") pod \"aws-ebs-csi-driver-node-nxxzr\" (UID: \"b7c909cf-d472-4d25-8c6a-3ae53e4931b5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nxxzr"
Apr 17 14:34:08.814253 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.813598 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f219cfa3-6451-4834-b849-5d264acf8bac-host-cni-bin\") pod \"ovnkube-node-f5brp\" (UID: \"f219cfa3-6451-4834-b849-5d264acf8bac\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5brp"
Apr 17 14:34:08.814253 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.813586 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b7c909cf-d472-4d25-8c6a-3ae53e4931b5-socket-dir\") pod \"aws-ebs-csi-driver-node-nxxzr\" (UID: \"b7c909cf-d472-4d25-8c6a-3ae53e4931b5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nxxzr"
Apr 17 14:34:08.814253 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.813678 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f219cfa3-6451-4834-b849-5d264acf8bac-ovn-node-metrics-cert\") pod \"ovnkube-node-f5brp\" (UID: \"f219cfa3-6451-4834-b849-5d264acf8bac\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5brp"
Apr 17 14:34:08.814253 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.813712 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f219cfa3-6451-4834-b849-5d264acf8bac-host-kubelet\") pod \"ovnkube-node-f5brp\" (UID: \"f219cfa3-6451-4834-b849-5d264acf8bac\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5brp"
Apr 17 14:34:08.814253 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.813737 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f219cfa3-6451-4834-b849-5d264acf8bac-var-lib-openvswitch\") pod \"ovnkube-node-f5brp\" (UID: \"f219cfa3-6451-4834-b849-5d264acf8bac\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5brp"
Apr 17 14:34:08.814253 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.813680 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b7c909cf-d472-4d25-8c6a-3ae53e4931b5-socket-dir\") pod \"aws-ebs-csi-driver-node-nxxzr\" (UID: \"b7c909cf-d472-4d25-8c6a-3ae53e4931b5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nxxzr"
Apr 17 14:34:08.814253 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.813780 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f219cfa3-6451-4834-b849-5d264acf8bac-log-socket\") pod \"ovnkube-node-f5brp\" (UID: \"f219cfa3-6451-4834-b849-5d264acf8bac\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5brp"
Apr 17 14:34:08.814773 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.813838 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f219cfa3-6451-4834-b849-5d264acf8bac-log-socket\") pod \"ovnkube-node-f5brp\" (UID: \"f219cfa3-6451-4834-b849-5d264acf8bac\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5brp"
Apr 17 14:34:08.814773 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.813843 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f219cfa3-6451-4834-b849-5d264acf8bac-node-log\") pod \"ovnkube-node-f5brp\" (UID: \"f219cfa3-6451-4834-b849-5d264acf8bac\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5brp"
Apr 17 14:34:08.814773 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.813785 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f219cfa3-6451-4834-b849-5d264acf8bac-host-kubelet\") pod \"ovnkube-node-f5brp\" (UID: \"f219cfa3-6451-4834-b849-5d264acf8bac\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5brp"
Apr 17 14:34:08.814773 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.813883 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f219cfa3-6451-4834-b849-5d264acf8bac-node-log\") pod \"ovnkube-node-f5brp\" (UID: \"f219cfa3-6451-4834-b849-5d264acf8bac\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5brp"
Apr 17 14:34:08.814773 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.813787 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f219cfa3-6451-4834-b849-5d264acf8bac-var-lib-openvswitch\") pod \"ovnkube-node-f5brp\" (UID: \"f219cfa3-6451-4834-b849-5d264acf8bac\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5brp"
Apr 17 14:34:08.814773 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.813962 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f219cfa3-6451-4834-b849-5d264acf8bac-run-systemd\") pod \"ovnkube-node-f5brp\" (UID: \"f219cfa3-6451-4834-b849-5d264acf8bac\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5brp"
Apr 17 14:34:08.814773 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.813998 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f219cfa3-6451-4834-b849-5d264acf8bac-run-systemd\") pod \"ovnkube-node-f5brp\" (UID: \"f219cfa3-6451-4834-b849-5d264acf8bac\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5brp"
Apr 17 14:34:08.814773 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.813997 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f219cfa3-6451-4834-b849-5d264acf8bac-run-openvswitch\") pod \"ovnkube-node-f5brp\" (UID: \"f219cfa3-6451-4834-b849-5d264acf8bac\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5brp"
Apr 17 14:34:08.814773 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.814042 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f219cfa3-6451-4834-b849-5d264acf8bac-host-cni-netd\") pod \"ovnkube-node-f5brp\" (UID: \"f219cfa3-6451-4834-b849-5d264acf8bac\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5brp"
Apr 17 14:34:08.814773 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.814095 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume
\"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f219cfa3-6451-4834-b849-5d264acf8bac-run-openvswitch\") pod \"ovnkube-node-f5brp\" (UID: \"f219cfa3-6451-4834-b849-5d264acf8bac\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5brp" Apr 17 14:34:08.814773 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.814112 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f219cfa3-6451-4834-b849-5d264acf8bac-host-cni-netd\") pod \"ovnkube-node-f5brp\" (UID: \"f219cfa3-6451-4834-b849-5d264acf8bac\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5brp" Apr 17 14:34:08.814773 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.814139 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f219cfa3-6451-4834-b849-5d264acf8bac-env-overrides\") pod \"ovnkube-node-f5brp\" (UID: \"f219cfa3-6451-4834-b849-5d264acf8bac\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5brp" Apr 17 14:34:08.814773 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.814152 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f219cfa3-6451-4834-b849-5d264acf8bac-ovnkube-config\") pod \"ovnkube-node-f5brp\" (UID: \"f219cfa3-6451-4834-b849-5d264acf8bac\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5brp" Apr 17 14:34:08.814773 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.814328 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f219cfa3-6451-4834-b849-5d264acf8bac-ovnkube-script-lib\") pod \"ovnkube-node-f5brp\" (UID: \"f219cfa3-6451-4834-b849-5d264acf8bac\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5brp" Apr 17 14:34:08.816048 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.816028 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f219cfa3-6451-4834-b849-5d264acf8bac-ovn-node-metrics-cert\") pod \"ovnkube-node-f5brp\" (UID: \"f219cfa3-6451-4834-b849-5d264acf8bac\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5brp" Apr 17 14:34:08.818760 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:34:08.818741 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 14:34:08.818760 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:34:08.818761 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 14:34:08.818897 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:34:08.818770 2582 projected.go:194] Error preparing data for projected volume kube-api-access-5f9qw for pod openshift-network-diagnostics/network-check-target-jfnzx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 14:34:08.818897 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:34:08.818879 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/efde8bcb-629a-4cd7-9fe4-cea71e67b06e-kube-api-access-5f9qw podName:efde8bcb-629a-4cd7-9fe4-cea71e67b06e nodeName:}" failed. No retries permitted until 2026-04-17 14:34:09.318839895 +0000 UTC m=+2.186225545 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-5f9qw" (UniqueName: "kubernetes.io/projected/efde8bcb-629a-4cd7-9fe4-cea71e67b06e-kube-api-access-5f9qw") pod "network-check-target-jfnzx" (UID: "efde8bcb-629a-4cd7-9fe4-cea71e67b06e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 14:34:08.821030 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.821008 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzq8x\" (UniqueName: \"kubernetes.io/projected/b7c909cf-d472-4d25-8c6a-3ae53e4931b5-kube-api-access-lzq8x\") pod \"aws-ebs-csi-driver-node-nxxzr\" (UID: \"b7c909cf-d472-4d25-8c6a-3ae53e4931b5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nxxzr" Apr 17 14:34:08.821820 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.821782 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcccx\" (UniqueName: \"kubernetes.io/projected/f219cfa3-6451-4834-b849-5d264acf8bac-kube-api-access-jcccx\") pod \"ovnkube-node-f5brp\" (UID: \"f219cfa3-6451-4834-b849-5d264acf8bac\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5brp" Apr 17 14:34:08.862707 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:08.862470 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod342031fe6556563fbef6e8d55c3a781f.slice/crio-36a9f4c25bdd4358663291c229801e9cdfd697fd359864746a0a81c63f3d277e WatchSource:0}: Error finding container 36a9f4c25bdd4358663291c229801e9cdfd697fd359864746a0a81c63f3d277e: Status 404 returned error can't find the container with id 36a9f4c25bdd4358663291c229801e9cdfd697fd359864746a0a81c63f3d277e Apr 17 14:34:08.863533 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:08.863490 2582 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff484d21f6659085ac06da34cbc27dec.slice/crio-0a62cdf367d669f0047918bd72531c482ea22c5c29b66dc1e1b22b81b6480fc9 WatchSource:0}: Error finding container 0a62cdf367d669f0047918bd72531c482ea22c5c29b66dc1e1b22b81b6480fc9: Status 404 returned error can't find the container with id 0a62cdf367d669f0047918bd72531c482ea22c5c29b66dc1e1b22b81b6480fc9 Apr 17 14:34:08.868329 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.868308 2582 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 14:34:08.921887 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.921849 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-m5qlh" Apr 17 14:34:08.928179 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:08.928155 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod542f49ba_8bb4_4178_9c98_a94bc1f60de1.slice/crio-ecaac71e9fa2a1fd1e6c869dab693493c93ee0165a83de268f597e35831a1a68 WatchSource:0}: Error finding container ecaac71e9fa2a1fd1e6c869dab693493c93ee0165a83de268f597e35831a1a68: Status 404 returned error can't find the container with id ecaac71e9fa2a1fd1e6c869dab693493c93ee0165a83de268f597e35831a1a68 Apr 17 14:34:08.932148 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.932128 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-rhww7" Apr 17 14:34:08.938330 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:08.938306 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod82ff5ce9_528b_4d19_9a09_ac7e64ef9d46.slice/crio-7feb3c6dd679054a2346afdff53d7074fc4b5f7ead0f3a01bf74df1de7765e1a WatchSource:0}: Error finding container 7feb3c6dd679054a2346afdff53d7074fc4b5f7ead0f3a01bf74df1de7765e1a: Status 404 returned error can't find the container with id 7feb3c6dd679054a2346afdff53d7074fc4b5f7ead0f3a01bf74df1de7765e1a Apr 17 14:34:08.945385 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.945363 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-4zcj9" Apr 17 14:34:08.950935 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.950913 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-t74vl" Apr 17 14:34:08.951665 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:08.951645 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f84353d_e913_4a0e_94b9_1138b03b1814.slice/crio-fe772f6345a7c003a689e73504c1f2cc6f0af92849bffa610acbeb119e793511 WatchSource:0}: Error finding container fe772f6345a7c003a689e73504c1f2cc6f0af92849bffa610acbeb119e793511: Status 404 returned error can't find the container with id fe772f6345a7c003a689e73504c1f2cc6f0af92849bffa610acbeb119e793511 Apr 17 14:34:08.957769 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:08.957745 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode46a81b9_6866_4354_99c3_22badfdac979.slice/crio-943abd69c2038eeeb644b96675e60ba5c68ae3b689a4ad0993d5ac8a1bd183c3 WatchSource:0}: Error finding container 
943abd69c2038eeeb644b96675e60ba5c68ae3b689a4ad0993d5ac8a1bd183c3: Status 404 returned error can't find the container with id 943abd69c2038eeeb644b96675e60ba5c68ae3b689a4ad0993d5ac8a1bd183c3 Apr 17 14:34:08.967230 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.967209 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-5kzp2" Apr 17 14:34:08.973307 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:08.973284 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod175f6a59_d17b_42f0_b454_ff9a315c3d7a.slice/crio-c50ec432948f370fadbb86eb370d2591cdcba421e01ecce8d5806932f6f3f11d WatchSource:0}: Error finding container c50ec432948f370fadbb86eb370d2591cdcba421e01ecce8d5806932f6f3f11d: Status 404 returned error can't find the container with id c50ec432948f370fadbb86eb370d2591cdcba421e01ecce8d5806932f6f3f11d Apr 17 14:34:08.982061 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:08.982033 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-lvtf4" Apr 17 14:34:08.990093 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:08.990063 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod62d02888_cea1_4f15_b042_fb651835bf6a.slice/crio-137a656d607228ba43f2479c3e5320f4704bf8c1c337f06446e267056f65d7e9 WatchSource:0}: Error finding container 137a656d607228ba43f2479c3e5320f4704bf8c1c337f06446e267056f65d7e9: Status 404 returned error can't find the container with id 137a656d607228ba43f2479c3e5320f4704bf8c1c337f06446e267056f65d7e9 Apr 17 14:34:09.006708 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:09.006683 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-tcns5" Apr 17 14:34:09.012528 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:09.012498 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod044e9f1a_a8ec_4b10_8647_92f9ec016842.slice/crio-bc2f8b3fdc146cdce1ad2d5fbb704a4f20bdf34716d90ae0005e990769655ba3 WatchSource:0}: Error finding container bc2f8b3fdc146cdce1ad2d5fbb704a4f20bdf34716d90ae0005e990769655ba3: Status 404 returned error can't find the container with id bc2f8b3fdc146cdce1ad2d5fbb704a4f20bdf34716d90ae0005e990769655ba3 Apr 17 14:34:09.031777 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:09.031751 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-f5brp" Apr 17 14:34:09.037487 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:09.037457 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nxxzr" Apr 17 14:34:09.038132 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:09.038044 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf219cfa3_6451_4834_b849_5d264acf8bac.slice/crio-3b92591dfb49ad1b13d789c6d0c5eef36aa575ffdf84fc24499ec5bf2839c4c3 WatchSource:0}: Error finding container 3b92591dfb49ad1b13d789c6d0c5eef36aa575ffdf84fc24499ec5bf2839c4c3: Status 404 returned error can't find the container with id 3b92591dfb49ad1b13d789c6d0c5eef36aa575ffdf84fc24499ec5bf2839c4c3 Apr 17 14:34:09.043611 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:34:09.043578 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb7c909cf_d472_4d25_8c6a_3ae53e4931b5.slice/crio-53bcc1d8586f5ff68acb535c38f0e3e2bd461627e7b412af108205a209dbbdb5 WatchSource:0}: Error finding container 
53bcc1d8586f5ff68acb535c38f0e3e2bd461627e7b412af108205a209dbbdb5: Status 404 returned error can't find the container with id 53bcc1d8586f5ff68acb535c38f0e3e2bd461627e7b412af108205a209dbbdb5 Apr 17 14:34:09.045246 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:34:09.045209 2582 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:csi-driver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:abb4d487159fbc9b7148c690cd3a6ee638680f8f879ff6195ca1be5b393705b0,Command:[],Args:[node --endpoint=$(CSI_ENDPOINT) --logtostderr --v=2 --reserved-volume-attachments=1],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:healthz,HostPort:10300,ContainerPort:10300,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:CSI_ENDPOINT,Value:unix:/csi/csi.sock,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:device-dir,ReadOnly:false,MountPath:/dev,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etc-selinux,ReadOnly:false,MountPath:/etc/selinux,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:sys-fs,ReadOnly:false,MountPath:/sys/fs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lzq8x,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{1 0 
healthz},Host:127.0.0.1,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:3,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:5,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod aws-ebs-csi-driver-node-nxxzr_openshift-cluster-csi-drivers(b7c909cf-d472-4d25-8c6a-3ae53e4931b5): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Apr 17 14:34:09.047101 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:34:09.047060 2582 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:49ae6b645f147135aff6d0f464fcd64972abf57733f027318ca7376691eece01,Command:[],Args:[--csi-address=/csi/csi.sock --kubelet-registration-path=/var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock --http-endpoint=127.0.0.1:10309 --v=2],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:rhealthz,HostPort:10309,ContainerPort:10309,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lzq8x,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{1 0 rhealthz},Host:127.0.0.1,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:3,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:5,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:&Lifecycle{PostStart:nil,PreStop:&LifecycleHandler{Exec:&ExecAction{Command:[/bin/sh -c rm -rf /registration/ebs.csi.aws.com-reg.sock /csi/csi.sock],},HTTPGet:nil,TCPSocket:nil,Sleep:nil,},StopSignal:nil,},TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod aws-ebs-csi-driver-node-nxxzr_openshift-cluster-csi-drivers(b7c909cf-d472-4d25-8c6a-3ae53e4931b5): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Apr 17 14:34:09.048846 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:34:09.048795 2582 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:csi-liveness-probe,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:232019e2eb8e13139570277b223ff822086fa83edc73958cbf919d6b57118068,Command:[],Args:[--csi-address=/csi/csi.sock --http-endpoint=127.0.0.1:10300 --v=2 --probe-timeout=3s],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lzq8x,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod aws-ebs-csi-driver-node-nxxzr_openshift-cluster-csi-drivers(b7c909cf-d472-4d25-8c6a-3ae53e4931b5): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Apr 17 14:34:09.049970 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:34:09.049946 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"csi-driver\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"csi-liveness-probe\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nxxzr" podUID="b7c909cf-d472-4d25-8c6a-3ae53e4931b5" Apr 17 14:34:09.217896 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:09.217776 2582 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fbcf40f6-2ec0-4fb3-85d8-30ecb284384d-metrics-certs\") pod \"network-metrics-daemon-4kdjq\" (UID: \"fbcf40f6-2ec0-4fb3-85d8-30ecb284384d\") " pod="openshift-multus/network-metrics-daemon-4kdjq" Apr 17 14:34:09.218020 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:34:09.217923 2582 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 14:34:09.218020 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:34:09.217983 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fbcf40f6-2ec0-4fb3-85d8-30ecb284384d-metrics-certs podName:fbcf40f6-2ec0-4fb3-85d8-30ecb284384d nodeName:}" failed. No retries permitted until 2026-04-17 14:34:10.217969319 +0000 UTC m=+3.085354961 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fbcf40f6-2ec0-4fb3-85d8-30ecb284384d-metrics-certs") pod "network-metrics-daemon-4kdjq" (UID: "fbcf40f6-2ec0-4fb3-85d8-30ecb284384d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 14:34:09.319653 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:09.319175 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5f9qw\" (UniqueName: \"kubernetes.io/projected/efde8bcb-629a-4cd7-9fe4-cea71e67b06e-kube-api-access-5f9qw\") pod \"network-check-target-jfnzx\" (UID: \"efde8bcb-629a-4cd7-9fe4-cea71e67b06e\") " pod="openshift-network-diagnostics/network-check-target-jfnzx" Apr 17 14:34:09.319653 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:34:09.319299 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 14:34:09.319653 ip-10-0-143-171 kubenswrapper[2582]: E0417 
14:34:09.319315 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 14:34:09.319653 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:34:09.319323 2582 projected.go:194] Error preparing data for projected volume kube-api-access-5f9qw for pod openshift-network-diagnostics/network-check-target-jfnzx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 14:34:09.319653 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:34:09.319365 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/efde8bcb-629a-4cd7-9fe4-cea71e67b06e-kube-api-access-5f9qw podName:efde8bcb-629a-4cd7-9fe4-cea71e67b06e nodeName:}" failed. No retries permitted until 2026-04-17 14:34:10.31935088 +0000 UTC m=+3.186736509 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-5f9qw" (UniqueName: "kubernetes.io/projected/efde8bcb-629a-4cd7-9fe4-cea71e67b06e-kube-api-access-5f9qw") pod "network-check-target-jfnzx" (UID: "efde8bcb-629a-4cd7-9fe4-cea71e67b06e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 14:34:09.462495 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:09.462463 2582 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 14:34:09.477574 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:09.477502 2582 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 14:34:09.642996 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:09.642756 2582 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" 
expiration="2028-04-16 14:29:08 +0000 UTC" deadline="2027-12-11 20:25:58.204675672 +0000 UTC"
Apr 17 14:34:09.642996 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:09.642818 2582 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14477h51m48.561882524s"
Apr 17 14:34:09.771099 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:09.770986 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f5brp" event={"ID":"f219cfa3-6451-4834-b849-5d264acf8bac","Type":"ContainerStarted","Data":"3b92591dfb49ad1b13d789c6d0c5eef36aa575ffdf84fc24499ec5bf2839c4c3"}
Apr 17 14:34:09.773883 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:09.773853 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-tcns5" event={"ID":"044e9f1a-a8ec-4b10-8647-92f9ec016842","Type":"ContainerStarted","Data":"bc2f8b3fdc146cdce1ad2d5fbb704a4f20bdf34716d90ae0005e990769655ba3"}
Apr 17 14:34:09.785174 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:09.785141 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5kzp2" event={"ID":"175f6a59-d17b-42f0-b454-ff9a315c3d7a","Type":"ContainerStarted","Data":"c50ec432948f370fadbb86eb370d2591cdcba421e01ecce8d5806932f6f3f11d"}
Apr 17 14:34:09.788670 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:09.788627 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-t74vl" event={"ID":"e46a81b9-6866-4354-99c3-22badfdac979","Type":"ContainerStarted","Data":"943abd69c2038eeeb644b96675e60ba5c68ae3b689a4ad0993d5ac8a1bd183c3"}
Apr 17 14:34:09.793225 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:09.793198 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-m5qlh" event={"ID":"542f49ba-8bb4-4178-9c98-a94bc1f60de1","Type":"ContainerStarted","Data":"ecaac71e9fa2a1fd1e6c869dab693493c93ee0165a83de268f597e35831a1a68"}
Apr 17 14:34:09.813720 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:09.813678 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nxxzr" event={"ID":"b7c909cf-d472-4d25-8c6a-3ae53e4931b5","Type":"ContainerStarted","Data":"53bcc1d8586f5ff68acb535c38f0e3e2bd461627e7b412af108205a209dbbdb5"}
Apr 17 14:34:09.816161 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:09.816115 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-lvtf4" event={"ID":"62d02888-cea1-4f15-b042-fb651835bf6a","Type":"ContainerStarted","Data":"137a656d607228ba43f2479c3e5320f4704bf8c1c337f06446e267056f65d7e9"}
Apr 17 14:34:09.820364 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:09.820340 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-4zcj9" event={"ID":"7f84353d-e913-4a0e-94b9-1138b03b1814","Type":"ContainerStarted","Data":"fe772f6345a7c003a689e73504c1f2cc6f0af92849bffa610acbeb119e793511"}
Apr 17 14:34:09.820750 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:34:09.820715 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"csi-driver\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:abb4d487159fbc9b7148c690cd3a6ee638680f8f879ff6195ca1be5b393705b0\\\": ErrImagePull: pull QPS exceeded\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:49ae6b645f147135aff6d0f464fcd64972abf57733f027318ca7376691eece01\\\": ErrImagePull: pull QPS exceeded\", failed to \"StartContainer\" for \"csi-liveness-probe\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:232019e2eb8e13139570277b223ff822086fa83edc73958cbf919d6b57118068\\\": ErrImagePull: pull QPS exceeded\"]" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nxxzr" podUID="b7c909cf-d472-4d25-8c6a-3ae53e4931b5"
Apr 17 14:34:09.823776 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:09.823753 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-rhww7" event={"ID":"82ff5ce9-528b-4d19-9a09-ac7e64ef9d46","Type":"ContainerStarted","Data":"7feb3c6dd679054a2346afdff53d7074fc4b5f7ead0f3a01bf74df1de7765e1a"}
Apr 17 14:34:09.831258 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:09.828685 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-171.ec2.internal" event={"ID":"ff484d21f6659085ac06da34cbc27dec","Type":"ContainerStarted","Data":"0a62cdf367d669f0047918bd72531c482ea22c5c29b66dc1e1b22b81b6480fc9"}
Apr 17 14:34:09.832353 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:09.832316 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-171.ec2.internal" event={"ID":"342031fe6556563fbef6e8d55c3a781f","Type":"ContainerStarted","Data":"36a9f4c25bdd4358663291c229801e9cdfd697fd359864746a0a81c63f3d277e"}
Apr 17 14:34:10.086349 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:10.086245 2582 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 14:34:10.228247 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:10.228204 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fbcf40f6-2ec0-4fb3-85d8-30ecb284384d-metrics-certs\") pod \"network-metrics-daemon-4kdjq\" (UID: \"fbcf40f6-2ec0-4fb3-85d8-30ecb284384d\") " pod="openshift-multus/network-metrics-daemon-4kdjq"
Apr 17 14:34:10.228442 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:34:10.228401 2582 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 14:34:10.228506 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:34:10.228465 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fbcf40f6-2ec0-4fb3-85d8-30ecb284384d-metrics-certs podName:fbcf40f6-2ec0-4fb3-85d8-30ecb284384d nodeName:}" failed. No retries permitted until 2026-04-17 14:34:12.228446175 +0000 UTC m=+5.095831821 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fbcf40f6-2ec0-4fb3-85d8-30ecb284384d-metrics-certs") pod "network-metrics-daemon-4kdjq" (UID: "fbcf40f6-2ec0-4fb3-85d8-30ecb284384d") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 14:34:10.329367 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:10.329328 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5f9qw\" (UniqueName: \"kubernetes.io/projected/efde8bcb-629a-4cd7-9fe4-cea71e67b06e-kube-api-access-5f9qw\") pod \"network-check-target-jfnzx\" (UID: \"efde8bcb-629a-4cd7-9fe4-cea71e67b06e\") " pod="openshift-network-diagnostics/network-check-target-jfnzx"
Apr 17 14:34:10.329565 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:34:10.329520 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 14:34:10.329565 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:34:10.329539 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 14:34:10.329565 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:34:10.329551 2582 projected.go:194] Error preparing data for projected volume kube-api-access-5f9qw for pod openshift-network-diagnostics/network-check-target-jfnzx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 14:34:10.329731 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:34:10.329607 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/efde8bcb-629a-4cd7-9fe4-cea71e67b06e-kube-api-access-5f9qw podName:efde8bcb-629a-4cd7-9fe4-cea71e67b06e nodeName:}" failed. No retries permitted until 2026-04-17 14:34:12.329590567 +0000 UTC m=+5.196976213 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-5f9qw" (UniqueName: "kubernetes.io/projected/efde8bcb-629a-4cd7-9fe4-cea71e67b06e-kube-api-access-5f9qw") pod "network-check-target-jfnzx" (UID: "efde8bcb-629a-4cd7-9fe4-cea71e67b06e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 14:34:10.643510 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:10.643443 2582 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 14:29:08 +0000 UTC" deadline="2027-11-19 11:21:37.792973844 +0000 UTC"
Apr 17 14:34:10.643510 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:10.643488 2582 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13940h47m27.149489364s"
Apr 17 14:34:10.738874 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:10.738833 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jfnzx"
Apr 17 14:34:10.739056 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:34:10.738977 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jfnzx" podUID="efde8bcb-629a-4cd7-9fe4-cea71e67b06e"
Apr 17 14:34:10.739472 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:10.739444 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4kdjq"
Apr 17 14:34:10.739583 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:34:10.739563 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4kdjq" podUID="fbcf40f6-2ec0-4fb3-85d8-30ecb284384d"
Apr 17 14:34:12.246621 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:12.246587 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fbcf40f6-2ec0-4fb3-85d8-30ecb284384d-metrics-certs\") pod \"network-metrics-daemon-4kdjq\" (UID: \"fbcf40f6-2ec0-4fb3-85d8-30ecb284384d\") " pod="openshift-multus/network-metrics-daemon-4kdjq"
Apr 17 14:34:12.247103 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:34:12.246740 2582 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 14:34:12.247103 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:34:12.246831 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fbcf40f6-2ec0-4fb3-85d8-30ecb284384d-metrics-certs podName:fbcf40f6-2ec0-4fb3-85d8-30ecb284384d nodeName:}" failed. No retries permitted until 2026-04-17 14:34:16.246795398 +0000 UTC m=+9.114181042 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fbcf40f6-2ec0-4fb3-85d8-30ecb284384d-metrics-certs") pod "network-metrics-daemon-4kdjq" (UID: "fbcf40f6-2ec0-4fb3-85d8-30ecb284384d") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 14:34:12.348115 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:12.347540 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5f9qw\" (UniqueName: \"kubernetes.io/projected/efde8bcb-629a-4cd7-9fe4-cea71e67b06e-kube-api-access-5f9qw\") pod \"network-check-target-jfnzx\" (UID: \"efde8bcb-629a-4cd7-9fe4-cea71e67b06e\") " pod="openshift-network-diagnostics/network-check-target-jfnzx"
Apr 17 14:34:12.348115 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:34:12.347686 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 14:34:12.348115 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:34:12.347701 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 14:34:12.348115 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:34:12.347712 2582 projected.go:194] Error preparing data for projected volume kube-api-access-5f9qw for pod openshift-network-diagnostics/network-check-target-jfnzx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 14:34:12.348115 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:34:12.347764 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/efde8bcb-629a-4cd7-9fe4-cea71e67b06e-kube-api-access-5f9qw podName:efde8bcb-629a-4cd7-9fe4-cea71e67b06e nodeName:}" failed. No retries permitted until 2026-04-17 14:34:16.347750332 +0000 UTC m=+9.215135961 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-5f9qw" (UniqueName: "kubernetes.io/projected/efde8bcb-629a-4cd7-9fe4-cea71e67b06e-kube-api-access-5f9qw") pod "network-check-target-jfnzx" (UID: "efde8bcb-629a-4cd7-9fe4-cea71e67b06e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 14:34:12.738693 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:12.738611 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jfnzx"
Apr 17 14:34:12.738882 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:34:12.738745 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jfnzx" podUID="efde8bcb-629a-4cd7-9fe4-cea71e67b06e"
Apr 17 14:34:12.739177 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:12.739158 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4kdjq"
Apr 17 14:34:12.739303 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:34:12.739267 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4kdjq" podUID="fbcf40f6-2ec0-4fb3-85d8-30ecb284384d"
Apr 17 14:34:14.738600 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:14.738366 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jfnzx"
Apr 17 14:34:14.739169 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:14.738389 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4kdjq"
Apr 17 14:34:14.739169 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:34:14.738704 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jfnzx" podUID="efde8bcb-629a-4cd7-9fe4-cea71e67b06e"
Apr 17 14:34:14.739169 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:34:14.738762 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4kdjq" podUID="fbcf40f6-2ec0-4fb3-85d8-30ecb284384d"
Apr 17 14:34:16.278689 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:16.278650 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fbcf40f6-2ec0-4fb3-85d8-30ecb284384d-metrics-certs\") pod \"network-metrics-daemon-4kdjq\" (UID: \"fbcf40f6-2ec0-4fb3-85d8-30ecb284384d\") " pod="openshift-multus/network-metrics-daemon-4kdjq"
Apr 17 14:34:16.279143 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:34:16.278845 2582 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 14:34:16.279143 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:34:16.278910 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fbcf40f6-2ec0-4fb3-85d8-30ecb284384d-metrics-certs podName:fbcf40f6-2ec0-4fb3-85d8-30ecb284384d nodeName:}" failed. No retries permitted until 2026-04-17 14:34:24.27889089 +0000 UTC m=+17.146276542 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fbcf40f6-2ec0-4fb3-85d8-30ecb284384d-metrics-certs") pod "network-metrics-daemon-4kdjq" (UID: "fbcf40f6-2ec0-4fb3-85d8-30ecb284384d") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 14:34:16.379619 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:16.379579 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5f9qw\" (UniqueName: \"kubernetes.io/projected/efde8bcb-629a-4cd7-9fe4-cea71e67b06e-kube-api-access-5f9qw\") pod \"network-check-target-jfnzx\" (UID: \"efde8bcb-629a-4cd7-9fe4-cea71e67b06e\") " pod="openshift-network-diagnostics/network-check-target-jfnzx"
Apr 17 14:34:16.379845 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:34:16.379729 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 14:34:16.379845 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:34:16.379751 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 14:34:16.379845 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:34:16.379764 2582 projected.go:194] Error preparing data for projected volume kube-api-access-5f9qw for pod openshift-network-diagnostics/network-check-target-jfnzx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 14:34:16.379970 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:34:16.379859 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/efde8bcb-629a-4cd7-9fe4-cea71e67b06e-kube-api-access-5f9qw podName:efde8bcb-629a-4cd7-9fe4-cea71e67b06e nodeName:}" failed. No retries permitted until 2026-04-17 14:34:24.37984363 +0000 UTC m=+17.247229259 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-5f9qw" (UniqueName: "kubernetes.io/projected/efde8bcb-629a-4cd7-9fe4-cea71e67b06e-kube-api-access-5f9qw") pod "network-check-target-jfnzx" (UID: "efde8bcb-629a-4cd7-9fe4-cea71e67b06e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 14:34:16.739136 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:16.738592 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4kdjq"
Apr 17 14:34:16.739136 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:34:16.738723 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4kdjq" podUID="fbcf40f6-2ec0-4fb3-85d8-30ecb284384d"
Apr 17 14:34:16.739136 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:16.738592 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jfnzx"
Apr 17 14:34:16.739136 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:34:16.738932 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jfnzx" podUID="efde8bcb-629a-4cd7-9fe4-cea71e67b06e"
Apr 17 14:34:18.738577 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:18.738534 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jfnzx"
Apr 17 14:34:18.739017 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:34:18.738659 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jfnzx" podUID="efde8bcb-629a-4cd7-9fe4-cea71e67b06e"
Apr 17 14:34:18.739017 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:18.738724 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4kdjq"
Apr 17 14:34:18.739017 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:34:18.738869 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4kdjq" podUID="fbcf40f6-2ec0-4fb3-85d8-30ecb284384d"
Apr 17 14:34:20.738750 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:20.738717 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4kdjq"
Apr 17 14:34:20.739207 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:20.738718 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jfnzx"
Apr 17 14:34:20.739207 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:34:20.738881 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4kdjq" podUID="fbcf40f6-2ec0-4fb3-85d8-30ecb284384d"
Apr 17 14:34:20.739207 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:34:20.738953 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jfnzx" podUID="efde8bcb-629a-4cd7-9fe4-cea71e67b06e"
Apr 17 14:34:22.739041 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:22.739001 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jfnzx"
Apr 17 14:34:22.739041 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:22.739033 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4kdjq"
Apr 17 14:34:22.739483 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:34:22.739123 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jfnzx" podUID="efde8bcb-629a-4cd7-9fe4-cea71e67b06e"
Apr 17 14:34:22.739483 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:34:22.739281 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4kdjq" podUID="fbcf40f6-2ec0-4fb3-85d8-30ecb284384d"
Apr 17 14:34:24.336044 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:24.336004 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fbcf40f6-2ec0-4fb3-85d8-30ecb284384d-metrics-certs\") pod \"network-metrics-daemon-4kdjq\" (UID: \"fbcf40f6-2ec0-4fb3-85d8-30ecb284384d\") " pod="openshift-multus/network-metrics-daemon-4kdjq"
Apr 17 14:34:24.336553 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:34:24.336159 2582 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 14:34:24.336553 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:34:24.336227 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fbcf40f6-2ec0-4fb3-85d8-30ecb284384d-metrics-certs podName:fbcf40f6-2ec0-4fb3-85d8-30ecb284384d nodeName:}" failed. No retries permitted until 2026-04-17 14:34:40.336210937 +0000 UTC m=+33.203596590 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fbcf40f6-2ec0-4fb3-85d8-30ecb284384d-metrics-certs") pod "network-metrics-daemon-4kdjq" (UID: "fbcf40f6-2ec0-4fb3-85d8-30ecb284384d") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 14:34:24.436663 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:24.436615 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5f9qw\" (UniqueName: \"kubernetes.io/projected/efde8bcb-629a-4cd7-9fe4-cea71e67b06e-kube-api-access-5f9qw\") pod \"network-check-target-jfnzx\" (UID: \"efde8bcb-629a-4cd7-9fe4-cea71e67b06e\") " pod="openshift-network-diagnostics/network-check-target-jfnzx"
Apr 17 14:34:24.436957 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:34:24.436853 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 14:34:24.436957 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:34:24.436878 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 14:34:24.436957 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:34:24.436888 2582 projected.go:194] Error preparing data for projected volume kube-api-access-5f9qw for pod openshift-network-diagnostics/network-check-target-jfnzx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 14:34:24.436957 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:34:24.436946 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/efde8bcb-629a-4cd7-9fe4-cea71e67b06e-kube-api-access-5f9qw podName:efde8bcb-629a-4cd7-9fe4-cea71e67b06e nodeName:}" failed. No retries permitted until 2026-04-17 14:34:40.436929148 +0000 UTC m=+33.304314800 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-5f9qw" (UniqueName: "kubernetes.io/projected/efde8bcb-629a-4cd7-9fe4-cea71e67b06e-kube-api-access-5f9qw") pod "network-check-target-jfnzx" (UID: "efde8bcb-629a-4cd7-9fe4-cea71e67b06e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 14:34:24.738341 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:24.738251 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jfnzx"
Apr 17 14:34:24.738341 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:24.738284 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4kdjq"
Apr 17 14:34:24.738543 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:34:24.738451 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4kdjq" podUID="fbcf40f6-2ec0-4fb3-85d8-30ecb284384d"
Apr 17 14:34:24.738889 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:34:24.738859 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jfnzx" podUID="efde8bcb-629a-4cd7-9fe4-cea71e67b06e"
Apr 17 14:34:26.738201 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:26.738170 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jfnzx"
Apr 17 14:34:26.738624 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:26.738180 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4kdjq"
Apr 17 14:34:26.738624 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:34:26.738264 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jfnzx" podUID="efde8bcb-629a-4cd7-9fe4-cea71e67b06e"
Apr 17 14:34:26.738624 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:34:26.738350 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4kdjq" podUID="fbcf40f6-2ec0-4fb3-85d8-30ecb284384d"
Apr 17 14:34:27.872196 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:27.870644 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-171.ec2.internal" event={"ID":"342031fe6556563fbef6e8d55c3a781f","Type":"ContainerStarted","Data":"413695f538fd3f8b47973185a74dadd4d11aa1e2584db4afa1a7716379c433bd"}
Apr 17 14:34:27.874212 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:27.873983 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f5brp" event={"ID":"f219cfa3-6451-4834-b849-5d264acf8bac","Type":"ContainerStarted","Data":"bcb89c080d6ea3544cdeedb56d72fb7496e9a9fbf7082e9fe962222832c979f4"}
Apr 17 14:34:27.874212 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:27.874020 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f5brp" event={"ID":"f219cfa3-6451-4834-b849-5d264acf8bac","Type":"ContainerStarted","Data":"746dbe6a023b0104159f3c17a000324d59f146ccdfd74ba9509ae0eb90a389ed"}
Apr 17 14:34:27.874212 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:27.874034 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f5brp" event={"ID":"f219cfa3-6451-4834-b849-5d264acf8bac","Type":"ContainerStarted","Data":"68db7bc2c3a30616bc6df809b94c0e3bcbcff61e93a993aa4d65f5cff049d5b0"}
Apr 17 14:34:27.874212 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:27.874045 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f5brp" event={"ID":"f219cfa3-6451-4834-b849-5d264acf8bac","Type":"ContainerStarted","Data":"91561d560bdda677d4ba1cc61b05f8d51c68368eebdcb1f842624052ce7650c6"}
Apr 17 14:34:27.874212 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:27.874067 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f5brp" event={"ID":"f219cfa3-6451-4834-b849-5d264acf8bac","Type":"ContainerStarted","Data":"ae530f786a19f74b7b2c908084e029acd862d7fa5c1ec478f25a12a148be216d"}
Apr 17 14:34:27.874212 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:27.874079 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f5brp" event={"ID":"f219cfa3-6451-4834-b849-5d264acf8bac","Type":"ContainerStarted","Data":"e9d6ad8e2d5d6ff3ed8b9a569c45b804141fdd2aba8d8127608361ec55e1943d"}
Apr 17 14:34:27.876064 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:27.875951 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-tcns5" event={"ID":"044e9f1a-a8ec-4b10-8647-92f9ec016842","Type":"ContainerStarted","Data":"539c8936f696afc8636f751f6da955fbe718d375a1223fd1e85efa4c2d7fd9cc"}
Apr 17 14:34:27.877492 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:27.877324 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-4zcj9" event={"ID":"7f84353d-e913-4a0e-94b9-1138b03b1814","Type":"ContainerStarted","Data":"edb91b895f31feeece3a0457990fede2178585bc07e07ffdcfba165ade3c9362"}
Apr 17 14:34:27.884171 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:27.883993 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-171.ec2.internal" podStartSLOduration=20.883980574 podStartE2EDuration="20.883980574s" podCreationTimestamp="2026-04-17 14:34:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 14:34:27.883397374 +0000 UTC m=+20.750783026" watchObservedRunningTime="2026-04-17 14:34:27.883980574 +0000 UTC m=+20.751366224"
Apr 17 14:34:27.898445 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:27.898359 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-4zcj9" podStartSLOduration=2.990390128 podStartE2EDuration="20.898340859s" podCreationTimestamp="2026-04-17 14:34:07 +0000 UTC" firstStartedPulling="2026-04-17 14:34:08.954070651 +0000 UTC m=+1.821456280" lastFinishedPulling="2026-04-17 14:34:26.862021366 +0000 UTC m=+19.729407011" observedRunningTime="2026-04-17 14:34:27.897999747 +0000 UTC m=+20.765385400" watchObservedRunningTime="2026-04-17 14:34:27.898340859 +0000 UTC m=+20.765726510"
Apr 17 14:34:28.739102 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:28.738857 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jfnzx"
Apr 17 14:34:28.739238 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:28.738891 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4kdjq"
Apr 17 14:34:28.739238 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:34:28.739130 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jfnzx" podUID="efde8bcb-629a-4cd7-9fe4-cea71e67b06e"
Apr 17 14:34:28.739238 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:34:28.739191 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4kdjq" podUID="fbcf40f6-2ec0-4fb3-85d8-30ecb284384d"
Apr 17 14:34:28.879665 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:28.879631 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nxxzr" event={"ID":"b7c909cf-d472-4d25-8c6a-3ae53e4931b5","Type":"ContainerStarted","Data":"b2b875b869bb57a1ce52caebcd1fbc4f6808742e7659aa65a31f14326715f6b6"}
Apr 17 14:34:28.880950 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:28.880920 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-lvtf4" event={"ID":"62d02888-cea1-4f15-b042-fb651835bf6a","Type":"ContainerStarted","Data":"9d04fa5b452c18cd67cb9bea6979bd89c88c0bfcef39b8eb08d2e5bf8ec486d4"}
Apr 17 14:34:28.882180 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:28.882155 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-rhww7" event={"ID":"82ff5ce9-528b-4d19-9a09-ac7e64ef9d46","Type":"ContainerStarted","Data":"3a621189e5c05298fbcfecd581280a1a2a730df08ec03b67b5dbe4da1e457a6e"}
Apr 17 14:34:28.883432 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:28.883412 2582 generic.go:358] "Generic (PLEG): container finished" podID="ff484d21f6659085ac06da34cbc27dec" containerID="e9b960209096d24ebf3f113167ef22694399334d02850e30113dd69fa227117e" exitCode=0
Apr 17 14:34:28.883509 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:28.883479 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-171.ec2.internal" event={"ID":"ff484d21f6659085ac06da34cbc27dec","Type":"ContainerDied","Data":"e9b960209096d24ebf3f113167ef22694399334d02850e30113dd69fa227117e"}
Apr 17 14:34:28.884928 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:28.884900 2582 generic.go:358] "Generic (PLEG): container finished" podID="175f6a59-d17b-42f0-b454-ff9a315c3d7a"
containerID="270115adc75d69db9c81ae6ba29600d456192c3d8e78a5f4a2bf153575fbb22b" exitCode=0 Apr 17 14:34:28.885141 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:28.884957 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5kzp2" event={"ID":"175f6a59-d17b-42f0-b454-ff9a315c3d7a","Type":"ContainerDied","Data":"270115adc75d69db9c81ae6ba29600d456192c3d8e78a5f4a2bf153575fbb22b"} Apr 17 14:34:28.888478 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:28.888443 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-t74vl" event={"ID":"e46a81b9-6866-4354-99c3-22badfdac979","Type":"ContainerStarted","Data":"4c1bd0cd7a1e978acdb731e39c54c65ff4bba71ec5d4edfdf1d0453946142c06"} Apr 17 14:34:28.889910 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:28.889871 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-m5qlh" event={"ID":"542f49ba-8bb4-4178-9c98-a94bc1f60de1","Type":"ContainerStarted","Data":"6fb1dd6d75c88a24af8c1dfce62870d2b7471d8b1d4d2177ebd8b02ee396d472"} Apr 17 14:34:28.894864 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:28.894829 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-lvtf4" podStartSLOduration=4.058070428 podStartE2EDuration="21.894797423s" podCreationTimestamp="2026-04-17 14:34:07 +0000 UTC" firstStartedPulling="2026-04-17 14:34:08.991491375 +0000 UTC m=+1.858877003" lastFinishedPulling="2026-04-17 14:34:26.828218368 +0000 UTC m=+19.695603998" observedRunningTime="2026-04-17 14:34:28.894564081 +0000 UTC m=+21.761949733" watchObservedRunningTime="2026-04-17 14:34:28.894797423 +0000 UTC m=+21.762183073" Apr 17 14:34:28.894953 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:28.894915 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-tcns5" podStartSLOduration=4.080797589 
podStartE2EDuration="21.894910398s" podCreationTimestamp="2026-04-17 14:34:07 +0000 UTC" firstStartedPulling="2026-04-17 14:34:09.014004706 +0000 UTC m=+1.881390335" lastFinishedPulling="2026-04-17 14:34:26.8281175 +0000 UTC m=+19.695503144" observedRunningTime="2026-04-17 14:34:27.915159616 +0000 UTC m=+20.782545271" watchObservedRunningTime="2026-04-17 14:34:28.894910398 +0000 UTC m=+21.762296051" Apr 17 14:34:28.907077 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:28.907029 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-t74vl" podStartSLOduration=4.062735251 podStartE2EDuration="21.907021373s" podCreationTimestamp="2026-04-17 14:34:07 +0000 UTC" firstStartedPulling="2026-04-17 14:34:08.959346083 +0000 UTC m=+1.826731712" lastFinishedPulling="2026-04-17 14:34:26.803632193 +0000 UTC m=+19.671017834" observedRunningTime="2026-04-17 14:34:28.906785802 +0000 UTC m=+21.774171454" watchObservedRunningTime="2026-04-17 14:34:28.907021373 +0000 UTC m=+21.774407025" Apr 17 14:34:28.950936 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:28.950778 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-m5qlh" podStartSLOduration=4.076835265 podStartE2EDuration="21.95076346s" podCreationTimestamp="2026-04-17 14:34:07 +0000 UTC" firstStartedPulling="2026-04-17 14:34:08.929702046 +0000 UTC m=+1.797087676" lastFinishedPulling="2026-04-17 14:34:26.803630228 +0000 UTC m=+19.671015871" observedRunningTime="2026-04-17 14:34:28.950217727 +0000 UTC m=+21.817603377" watchObservedRunningTime="2026-04-17 14:34:28.95076346 +0000 UTC m=+21.818149110" Apr 17 14:34:29.574817 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:29.574765 2582 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 17 14:34:29.681653 ip-10-0-143-171 kubenswrapper[2582]: I0417 
14:34:29.681504 2582 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-17T14:34:29.574791563Z","UUID":"d090da0b-4865-4c80-81f6-2634e9d85505","Handler":null,"Name":"","Endpoint":""} Apr 17 14:34:29.684370 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:29.684347 2582 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 17 14:34:29.684370 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:29.684377 2582 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 17 14:34:29.893359 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:29.893324 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nxxzr" event={"ID":"b7c909cf-d472-4d25-8c6a-3ae53e4931b5","Type":"ContainerStarted","Data":"95abaf9d9fdd121373d24ef979a9061d2235b1ea487e62940dd78054394bdafd"} Apr 17 14:34:29.895201 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:29.895175 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-171.ec2.internal" event={"ID":"ff484d21f6659085ac06da34cbc27dec","Type":"ContainerStarted","Data":"d7dbd6b9687b156194edd9ab829a1c7f12c3e98e479990caee42ddf5d5c351b7"} Apr 17 14:34:29.898013 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:29.897989 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f5brp" event={"ID":"f219cfa3-6451-4834-b849-5d264acf8bac","Type":"ContainerStarted","Data":"d362d466cff88da74469358313ecaa20b6bc85e03ae2cf934d0fc8a663935f26"} Apr 17 14:34:29.910538 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:29.910492 2582 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-image-registry/node-ca-rhww7" podStartSLOduration=4.997303496 podStartE2EDuration="22.91047761s" podCreationTimestamp="2026-04-17 14:34:07 +0000 UTC" firstStartedPulling="2026-04-17 14:34:08.939986053 +0000 UTC m=+1.807371695" lastFinishedPulling="2026-04-17 14:34:26.853160166 +0000 UTC m=+19.720545809" observedRunningTime="2026-04-17 14:34:28.96323122 +0000 UTC m=+21.830616867" watchObservedRunningTime="2026-04-17 14:34:29.91047761 +0000 UTC m=+22.777863264" Apr 17 14:34:29.910820 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:29.910783 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-171.ec2.internal" podStartSLOduration=22.91077975 podStartE2EDuration="22.91077975s" podCreationTimestamp="2026-04-17 14:34:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 14:34:29.910290797 +0000 UTC m=+22.777676448" watchObservedRunningTime="2026-04-17 14:34:29.91077975 +0000 UTC m=+22.778165401" Apr 17 14:34:30.154760 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:30.154678 2582 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-lvtf4" Apr 17 14:34:30.155299 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:30.155276 2582 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-lvtf4" Apr 17 14:34:30.738540 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:30.738457 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jfnzx" Apr 17 14:34:30.738690 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:30.738460 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4kdjq" Apr 17 14:34:30.738690 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:34:30.738551 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jfnzx" podUID="efde8bcb-629a-4cd7-9fe4-cea71e67b06e" Apr 17 14:34:30.738690 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:34:30.738646 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4kdjq" podUID="fbcf40f6-2ec0-4fb3-85d8-30ecb284384d" Apr 17 14:34:30.902670 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:30.902626 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nxxzr" event={"ID":"b7c909cf-d472-4d25-8c6a-3ae53e4931b5","Type":"ContainerStarted","Data":"12fd7b3941d548a55d4c5e9187e064fa1550184058d45014ba8a84a3844f1756"} Apr 17 14:34:30.918353 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:30.918298 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nxxzr" podStartSLOduration=2.5749079249999998 podStartE2EDuration="23.918280156s" podCreationTimestamp="2026-04-17 14:34:07 +0000 UTC" firstStartedPulling="2026-04-17 14:34:09.045038689 +0000 UTC m=+1.912424318" lastFinishedPulling="2026-04-17 14:34:30.388410919 +0000 UTC m=+23.255796549" observedRunningTime="2026-04-17 14:34:30.918025321 +0000 UTC m=+23.785410971" 
watchObservedRunningTime="2026-04-17 14:34:30.918280156 +0000 UTC m=+23.785665810" Apr 17 14:34:31.904465 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:31.904290 2582 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 17 14:34:32.738893 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:32.738863 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4kdjq" Apr 17 14:34:32.739031 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:32.738864 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jfnzx" Apr 17 14:34:32.739031 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:34:32.738965 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4kdjq" podUID="fbcf40f6-2ec0-4fb3-85d8-30ecb284384d" Apr 17 14:34:32.739103 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:34:32.739063 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-jfnzx" podUID="efde8bcb-629a-4cd7-9fe4-cea71e67b06e" Apr 17 14:34:32.910260 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:32.910222 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f5brp" event={"ID":"f219cfa3-6451-4834-b849-5d264acf8bac","Type":"ContainerStarted","Data":"fae0e48177ec6fb1408e27989d102460b339c2994cc7327a2834fd1e64030d21"} Apr 17 14:34:32.911172 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:32.910551 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-f5brp" Apr 17 14:34:32.911172 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:32.910681 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-f5brp" Apr 17 14:34:32.911172 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:32.910711 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-f5brp" Apr 17 14:34:32.927612 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:32.927584 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-f5brp" Apr 17 14:34:32.928525 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:32.928504 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-f5brp" Apr 17 14:34:32.937718 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:32.937667 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-f5brp" podStartSLOduration=7.762708516 podStartE2EDuration="25.937652351s" podCreationTimestamp="2026-04-17 14:34:07 +0000 UTC" firstStartedPulling="2026-04-17 14:34:09.04038216 +0000 UTC m=+1.907767793" lastFinishedPulling="2026-04-17 14:34:27.215325998 +0000 UTC m=+20.082711628" 
observedRunningTime="2026-04-17 14:34:32.937375671 +0000 UTC m=+25.804761325" watchObservedRunningTime="2026-04-17 14:34:32.937652351 +0000 UTC m=+25.805037997" Apr 17 14:34:33.921748 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:33.921464 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-jfnzx"] Apr 17 14:34:33.922513 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:33.921840 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jfnzx" Apr 17 14:34:33.922513 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:34:33.921948 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jfnzx" podUID="efde8bcb-629a-4cd7-9fe4-cea71e67b06e" Apr 17 14:34:33.922940 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:33.922917 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-4kdjq"] Apr 17 14:34:33.923016 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:33.923006 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4kdjq" Apr 17 14:34:33.923132 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:34:33.923116 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4kdjq" podUID="fbcf40f6-2ec0-4fb3-85d8-30ecb284384d" Apr 17 14:34:35.692591 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:35.692549 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-lvtf4" Apr 17 14:34:35.693160 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:35.692682 2582 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 17 14:34:35.693241 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:35.693219 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-lvtf4" Apr 17 14:34:35.738197 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:35.738164 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4kdjq" Apr 17 14:34:35.738343 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:35.738177 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jfnzx" Apr 17 14:34:35.738343 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:34:35.738314 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4kdjq" podUID="fbcf40f6-2ec0-4fb3-85d8-30ecb284384d" Apr 17 14:34:35.738441 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:34:35.738354 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-jfnzx" podUID="efde8bcb-629a-4cd7-9fe4-cea71e67b06e" Apr 17 14:34:35.918725 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:35.918685 2582 generic.go:358] "Generic (PLEG): container finished" podID="175f6a59-d17b-42f0-b454-ff9a315c3d7a" containerID="d0091dbba70fb12110f26e1ecf8b791b0be5ed34e2c3975cfa23518bcc403f65" exitCode=0 Apr 17 14:34:35.918883 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:35.918784 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5kzp2" event={"ID":"175f6a59-d17b-42f0-b454-ff9a315c3d7a","Type":"ContainerDied","Data":"d0091dbba70fb12110f26e1ecf8b791b0be5ed34e2c3975cfa23518bcc403f65"} Apr 17 14:34:36.922670 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:36.922630 2582 generic.go:358] "Generic (PLEG): container finished" podID="175f6a59-d17b-42f0-b454-ff9a315c3d7a" containerID="f44189cacd38ef0f807abbcc71296d797986b4def964d8e6b9c61c2c8df9cd5e" exitCode=0 Apr 17 14:34:36.923034 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:36.922685 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5kzp2" event={"ID":"175f6a59-d17b-42f0-b454-ff9a315c3d7a","Type":"ContainerDied","Data":"f44189cacd38ef0f807abbcc71296d797986b4def964d8e6b9c61c2c8df9cd5e"} Apr 17 14:34:37.739106 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:37.738867 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jfnzx" Apr 17 14:34:37.739288 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:37.738945 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4kdjq" Apr 17 14:34:37.739288 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:34:37.739117 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jfnzx" podUID="efde8bcb-629a-4cd7-9fe4-cea71e67b06e" Apr 17 14:34:37.739288 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:34:37.739199 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4kdjq" podUID="fbcf40f6-2ec0-4fb3-85d8-30ecb284384d" Apr 17 14:34:37.926982 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:37.926943 2582 generic.go:358] "Generic (PLEG): container finished" podID="175f6a59-d17b-42f0-b454-ff9a315c3d7a" containerID="0a01ebf6fa620849a9f9b4a385c9f51b8ec0996dd29a09aabe79a1e8045b96a5" exitCode=0 Apr 17 14:34:37.927349 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:37.926989 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5kzp2" event={"ID":"175f6a59-d17b-42f0-b454-ff9a315c3d7a","Type":"ContainerDied","Data":"0a01ebf6fa620849a9f9b4a385c9f51b8ec0996dd29a09aabe79a1e8045b96a5"} Apr 17 14:34:39.738309 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:39.738272 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jfnzx" Apr 17 14:34:39.738923 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:34:39.738398 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jfnzx" podUID="efde8bcb-629a-4cd7-9fe4-cea71e67b06e" Apr 17 14:34:39.738923 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:39.738271 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4kdjq" Apr 17 14:34:39.738923 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:34:39.738879 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4kdjq" podUID="fbcf40f6-2ec0-4fb3-85d8-30ecb284384d" Apr 17 14:34:39.960387 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:39.960348 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-171.ec2.internal" event="NodeReady" Apr 17 14:34:39.960585 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:39.960528 2582 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 17 14:34:40.004395 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:40.004320 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-kj9w5"] Apr 17 14:34:40.006924 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:40.006896 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-87v7h"] Apr 17 14:34:40.007065 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:40.007051 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-kj9w5" Apr 17 14:34:40.009427 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:40.009403 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 17 14:34:40.009427 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:40.009424 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 17 14:34:40.009599 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:40.009471 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-fr4dp\"" Apr 17 14:34:40.010925 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:40.010907 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-87v7h" Apr 17 14:34:40.012904 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:40.012884 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-dtcwn\"" Apr 17 14:34:40.013009 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:40.012985 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 17 14:34:40.013126 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:40.013110 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 17 14:34:40.013317 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:40.013299 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 17 14:34:40.015704 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:40.015670 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-kj9w5"] Apr 17 14:34:40.019553 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:40.019518 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-87v7h"] Apr 17 14:34:40.148181 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:40.148145 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/fe006aa3-2754-4ceb-acc9-c8189d25053b-tmp-dir\") pod \"dns-default-kj9w5\" (UID: \"fe006aa3-2754-4ceb-acc9-c8189d25053b\") " pod="openshift-dns/dns-default-kj9w5" Apr 17 14:34:40.148351 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:40.148191 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fe006aa3-2754-4ceb-acc9-c8189d25053b-metrics-tls\") pod 
\"dns-default-kj9w5\" (UID: \"fe006aa3-2754-4ceb-acc9-c8189d25053b\") " pod="openshift-dns/dns-default-kj9w5" Apr 17 14:34:40.148351 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:40.148216 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/76590649-d620-489b-9a3c-5c78ec32d35e-cert\") pod \"ingress-canary-87v7h\" (UID: \"76590649-d620-489b-9a3c-5c78ec32d35e\") " pod="openshift-ingress-canary/ingress-canary-87v7h" Apr 17 14:34:40.148351 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:40.148242 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlb8d\" (UniqueName: \"kubernetes.io/projected/76590649-d620-489b-9a3c-5c78ec32d35e-kube-api-access-hlb8d\") pod \"ingress-canary-87v7h\" (UID: \"76590649-d620-489b-9a3c-5c78ec32d35e\") " pod="openshift-ingress-canary/ingress-canary-87v7h" Apr 17 14:34:40.148351 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:40.148322 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7gt2\" (UniqueName: \"kubernetes.io/projected/fe006aa3-2754-4ceb-acc9-c8189d25053b-kube-api-access-v7gt2\") pod \"dns-default-kj9w5\" (UID: \"fe006aa3-2754-4ceb-acc9-c8189d25053b\") " pod="openshift-dns/dns-default-kj9w5" Apr 17 14:34:40.148537 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:40.148373 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fe006aa3-2754-4ceb-acc9-c8189d25053b-config-volume\") pod \"dns-default-kj9w5\" (UID: \"fe006aa3-2754-4ceb-acc9-c8189d25053b\") " pod="openshift-dns/dns-default-kj9w5" Apr 17 14:34:40.249674 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:40.249637 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: 
\"kubernetes.io/empty-dir/fe006aa3-2754-4ceb-acc9-c8189d25053b-tmp-dir\") pod \"dns-default-kj9w5\" (UID: \"fe006aa3-2754-4ceb-acc9-c8189d25053b\") " pod="openshift-dns/dns-default-kj9w5" Apr 17 14:34:40.249899 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:40.249686 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fe006aa3-2754-4ceb-acc9-c8189d25053b-metrics-tls\") pod \"dns-default-kj9w5\" (UID: \"fe006aa3-2754-4ceb-acc9-c8189d25053b\") " pod="openshift-dns/dns-default-kj9w5" Apr 17 14:34:40.249899 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:40.249718 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/76590649-d620-489b-9a3c-5c78ec32d35e-cert\") pod \"ingress-canary-87v7h\" (UID: \"76590649-d620-489b-9a3c-5c78ec32d35e\") " pod="openshift-ingress-canary/ingress-canary-87v7h" Apr 17 14:34:40.249899 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:40.249743 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hlb8d\" (UniqueName: \"kubernetes.io/projected/76590649-d620-489b-9a3c-5c78ec32d35e-kube-api-access-hlb8d\") pod \"ingress-canary-87v7h\" (UID: \"76590649-d620-489b-9a3c-5c78ec32d35e\") " pod="openshift-ingress-canary/ingress-canary-87v7h" Apr 17 14:34:40.249899 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:40.249779 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v7gt2\" (UniqueName: \"kubernetes.io/projected/fe006aa3-2754-4ceb-acc9-c8189d25053b-kube-api-access-v7gt2\") pod \"dns-default-kj9w5\" (UID: \"fe006aa3-2754-4ceb-acc9-c8189d25053b\") " pod="openshift-dns/dns-default-kj9w5" Apr 17 14:34:40.249899 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:40.249832 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/fe006aa3-2754-4ceb-acc9-c8189d25053b-config-volume\") pod \"dns-default-kj9w5\" (UID: \"fe006aa3-2754-4ceb-acc9-c8189d25053b\") " pod="openshift-dns/dns-default-kj9w5" Apr 17 14:34:40.250157 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:34:40.249902 2582 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 14:34:40.250157 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:34:40.249973 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/76590649-d620-489b-9a3c-5c78ec32d35e-cert podName:76590649-d620-489b-9a3c-5c78ec32d35e nodeName:}" failed. No retries permitted until 2026-04-17 14:34:40.749951783 +0000 UTC m=+33.617337426 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/76590649-d620-489b-9a3c-5c78ec32d35e-cert") pod "ingress-canary-87v7h" (UID: "76590649-d620-489b-9a3c-5c78ec32d35e") : secret "canary-serving-cert" not found Apr 17 14:34:40.250157 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:34:40.249902 2582 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 14:34:40.250157 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:34:40.250122 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fe006aa3-2754-4ceb-acc9-c8189d25053b-metrics-tls podName:fe006aa3-2754-4ceb-acc9-c8189d25053b nodeName:}" failed. No retries permitted until 2026-04-17 14:34:40.75010933 +0000 UTC m=+33.617494966 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/fe006aa3-2754-4ceb-acc9-c8189d25053b-metrics-tls") pod "dns-default-kj9w5" (UID: "fe006aa3-2754-4ceb-acc9-c8189d25053b") : secret "dns-default-metrics-tls" not found Apr 17 14:34:40.250529 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:40.250507 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/fe006aa3-2754-4ceb-acc9-c8189d25053b-tmp-dir\") pod \"dns-default-kj9w5\" (UID: \"fe006aa3-2754-4ceb-acc9-c8189d25053b\") " pod="openshift-dns/dns-default-kj9w5" Apr 17 14:34:40.250758 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:40.250741 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fe006aa3-2754-4ceb-acc9-c8189d25053b-config-volume\") pod \"dns-default-kj9w5\" (UID: \"fe006aa3-2754-4ceb-acc9-c8189d25053b\") " pod="openshift-dns/dns-default-kj9w5" Apr 17 14:34:40.264347 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:40.264273 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7gt2\" (UniqueName: \"kubernetes.io/projected/fe006aa3-2754-4ceb-acc9-c8189d25053b-kube-api-access-v7gt2\") pod \"dns-default-kj9w5\" (UID: \"fe006aa3-2754-4ceb-acc9-c8189d25053b\") " pod="openshift-dns/dns-default-kj9w5" Apr 17 14:34:40.264347 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:40.264319 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlb8d\" (UniqueName: \"kubernetes.io/projected/76590649-d620-489b-9a3c-5c78ec32d35e-kube-api-access-hlb8d\") pod \"ingress-canary-87v7h\" (UID: \"76590649-d620-489b-9a3c-5c78ec32d35e\") " pod="openshift-ingress-canary/ingress-canary-87v7h" Apr 17 14:34:40.350667 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:40.350627 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" 
(UniqueName: \"kubernetes.io/secret/fbcf40f6-2ec0-4fb3-85d8-30ecb284384d-metrics-certs\") pod \"network-metrics-daemon-4kdjq\" (UID: \"fbcf40f6-2ec0-4fb3-85d8-30ecb284384d\") " pod="openshift-multus/network-metrics-daemon-4kdjq" Apr 17 14:34:40.350850 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:34:40.350795 2582 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 14:34:40.350907 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:34:40.350895 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fbcf40f6-2ec0-4fb3-85d8-30ecb284384d-metrics-certs podName:fbcf40f6-2ec0-4fb3-85d8-30ecb284384d nodeName:}" failed. No retries permitted until 2026-04-17 14:35:12.35087636 +0000 UTC m=+65.218261993 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fbcf40f6-2ec0-4fb3-85d8-30ecb284384d-metrics-certs") pod "network-metrics-daemon-4kdjq" (UID: "fbcf40f6-2ec0-4fb3-85d8-30ecb284384d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 14:34:40.451694 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:40.451657 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5f9qw\" (UniqueName: \"kubernetes.io/projected/efde8bcb-629a-4cd7-9fe4-cea71e67b06e-kube-api-access-5f9qw\") pod \"network-check-target-jfnzx\" (UID: \"efde8bcb-629a-4cd7-9fe4-cea71e67b06e\") " pod="openshift-network-diagnostics/network-check-target-jfnzx" Apr 17 14:34:40.451917 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:34:40.451875 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 14:34:40.451917 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:34:40.451899 2582 projected.go:289] Couldn't get configMap 
openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 14:34:40.451917 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:34:40.451910 2582 projected.go:194] Error preparing data for projected volume kube-api-access-5f9qw for pod openshift-network-diagnostics/network-check-target-jfnzx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 14:34:40.452067 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:34:40.451989 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/efde8bcb-629a-4cd7-9fe4-cea71e67b06e-kube-api-access-5f9qw podName:efde8bcb-629a-4cd7-9fe4-cea71e67b06e nodeName:}" failed. No retries permitted until 2026-04-17 14:35:12.451973232 +0000 UTC m=+65.319358865 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-5f9qw" (UniqueName: "kubernetes.io/projected/efde8bcb-629a-4cd7-9fe4-cea71e67b06e-kube-api-access-5f9qw") pod "network-check-target-jfnzx" (UID: "efde8bcb-629a-4cd7-9fe4-cea71e67b06e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 14:34:40.754786 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:40.754704 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fe006aa3-2754-4ceb-acc9-c8189d25053b-metrics-tls\") pod \"dns-default-kj9w5\" (UID: \"fe006aa3-2754-4ceb-acc9-c8189d25053b\") " pod="openshift-dns/dns-default-kj9w5" Apr 17 14:34:40.754786 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:40.754747 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/76590649-d620-489b-9a3c-5c78ec32d35e-cert\") pod \"ingress-canary-87v7h\" 
(UID: \"76590649-d620-489b-9a3c-5c78ec32d35e\") " pod="openshift-ingress-canary/ingress-canary-87v7h" Apr 17 14:34:40.755362 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:34:40.754874 2582 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 14:34:40.755362 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:34:40.754876 2582 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 14:34:40.755362 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:34:40.754926 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/76590649-d620-489b-9a3c-5c78ec32d35e-cert podName:76590649-d620-489b-9a3c-5c78ec32d35e nodeName:}" failed. No retries permitted until 2026-04-17 14:34:41.754912007 +0000 UTC m=+34.622297641 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/76590649-d620-489b-9a3c-5c78ec32d35e-cert") pod "ingress-canary-87v7h" (UID: "76590649-d620-489b-9a3c-5c78ec32d35e") : secret "canary-serving-cert" not found Apr 17 14:34:40.755362 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:34:40.754939 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fe006aa3-2754-4ceb-acc9-c8189d25053b-metrics-tls podName:fe006aa3-2754-4ceb-acc9-c8189d25053b nodeName:}" failed. No retries permitted until 2026-04-17 14:34:41.754933256 +0000 UTC m=+34.622318885 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/fe006aa3-2754-4ceb-acc9-c8189d25053b-metrics-tls") pod "dns-default-kj9w5" (UID: "fe006aa3-2754-4ceb-acc9-c8189d25053b") : secret "dns-default-metrics-tls" not found Apr 17 14:34:41.738765 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:41.738730 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jfnzx" Apr 17 14:34:41.738765 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:41.738754 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4kdjq" Apr 17 14:34:41.741388 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:41.741355 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 17 14:34:41.742487 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:41.742215 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 17 14:34:41.742487 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:41.742263 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-mnbtt\"" Apr 17 14:34:41.742487 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:41.742267 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 17 14:34:41.742487 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:41.742219 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-tg92k\"" Apr 17 14:34:41.760608 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:41.760568 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/76590649-d620-489b-9a3c-5c78ec32d35e-cert\") pod \"ingress-canary-87v7h\" (UID: \"76590649-d620-489b-9a3c-5c78ec32d35e\") " pod="openshift-ingress-canary/ingress-canary-87v7h" Apr 17 14:34:41.761021 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:41.760674 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/fe006aa3-2754-4ceb-acc9-c8189d25053b-metrics-tls\") pod \"dns-default-kj9w5\" (UID: \"fe006aa3-2754-4ceb-acc9-c8189d25053b\") " pod="openshift-dns/dns-default-kj9w5" Apr 17 14:34:41.761021 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:34:41.760719 2582 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 14:34:41.761021 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:34:41.760757 2582 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 14:34:41.761021 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:34:41.760790 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/76590649-d620-489b-9a3c-5c78ec32d35e-cert podName:76590649-d620-489b-9a3c-5c78ec32d35e nodeName:}" failed. No retries permitted until 2026-04-17 14:34:43.76077258 +0000 UTC m=+36.628158226 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/76590649-d620-489b-9a3c-5c78ec32d35e-cert") pod "ingress-canary-87v7h" (UID: "76590649-d620-489b-9a3c-5c78ec32d35e") : secret "canary-serving-cert" not found Apr 17 14:34:41.761021 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:34:41.760823 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fe006aa3-2754-4ceb-acc9-c8189d25053b-metrics-tls podName:fe006aa3-2754-4ceb-acc9-c8189d25053b nodeName:}" failed. No retries permitted until 2026-04-17 14:34:43.760815403 +0000 UTC m=+36.628201032 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/fe006aa3-2754-4ceb-acc9-c8189d25053b-metrics-tls") pod "dns-default-kj9w5" (UID: "fe006aa3-2754-4ceb-acc9-c8189d25053b") : secret "dns-default-metrics-tls" not found Apr 17 14:34:43.777723 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:43.777684 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fe006aa3-2754-4ceb-acc9-c8189d25053b-metrics-tls\") pod \"dns-default-kj9w5\" (UID: \"fe006aa3-2754-4ceb-acc9-c8189d25053b\") " pod="openshift-dns/dns-default-kj9w5" Apr 17 14:34:43.778205 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:43.777739 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/76590649-d620-489b-9a3c-5c78ec32d35e-cert\") pod \"ingress-canary-87v7h\" (UID: \"76590649-d620-489b-9a3c-5c78ec32d35e\") " pod="openshift-ingress-canary/ingress-canary-87v7h" Apr 17 14:34:43.778205 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:34:43.777883 2582 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 14:34:43.778205 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:34:43.777897 2582 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 14:34:43.778205 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:34:43.777992 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fe006aa3-2754-4ceb-acc9-c8189d25053b-metrics-tls podName:fe006aa3-2754-4ceb-acc9-c8189d25053b nodeName:}" failed. No retries permitted until 2026-04-17 14:34:47.777969112 +0000 UTC m=+40.645354746 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/fe006aa3-2754-4ceb-acc9-c8189d25053b-metrics-tls") pod "dns-default-kj9w5" (UID: "fe006aa3-2754-4ceb-acc9-c8189d25053b") : secret "dns-default-metrics-tls" not found Apr 17 14:34:43.778205 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:34:43.778015 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/76590649-d620-489b-9a3c-5c78ec32d35e-cert podName:76590649-d620-489b-9a3c-5c78ec32d35e nodeName:}" failed. No retries permitted until 2026-04-17 14:34:47.77800366 +0000 UTC m=+40.645389291 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/76590649-d620-489b-9a3c-5c78ec32d35e-cert") pod "ingress-canary-87v7h" (UID: "76590649-d620-489b-9a3c-5c78ec32d35e") : secret "canary-serving-cert" not found Apr 17 14:34:44.943821 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:44.943615 2582 generic.go:358] "Generic (PLEG): container finished" podID="175f6a59-d17b-42f0-b454-ff9a315c3d7a" containerID="2d25a68faa4dfc353874b5370ca550ab7f864867c6c74acc438c93695dcf7143" exitCode=0 Apr 17 14:34:44.943821 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:44.943688 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5kzp2" event={"ID":"175f6a59-d17b-42f0-b454-ff9a315c3d7a","Type":"ContainerDied","Data":"2d25a68faa4dfc353874b5370ca550ab7f864867c6c74acc438c93695dcf7143"} Apr 17 14:34:45.948046 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:45.948010 2582 generic.go:358] "Generic (PLEG): container finished" podID="175f6a59-d17b-42f0-b454-ff9a315c3d7a" containerID="fab5d233029d765c2ba091f2a09c0e6b9299e2cedcb96ceeacafe1749bca5aab" exitCode=0 Apr 17 14:34:45.948507 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:45.948065 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5kzp2" 
event={"ID":"175f6a59-d17b-42f0-b454-ff9a315c3d7a","Type":"ContainerDied","Data":"fab5d233029d765c2ba091f2a09c0e6b9299e2cedcb96ceeacafe1749bca5aab"} Apr 17 14:34:46.953000 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:46.952967 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5kzp2" event={"ID":"175f6a59-d17b-42f0-b454-ff9a315c3d7a","Type":"ContainerStarted","Data":"685a4d2a6d229db56256a20c64bfc99c8f2d7f9f0f106da3efffc091b567ad2a"} Apr 17 14:34:46.974674 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:46.974627 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-5kzp2" podStartSLOduration=4.89290938 podStartE2EDuration="39.974614707s" podCreationTimestamp="2026-04-17 14:34:07 +0000 UTC" firstStartedPulling="2026-04-17 14:34:08.97466254 +0000 UTC m=+1.842048169" lastFinishedPulling="2026-04-17 14:34:44.056367864 +0000 UTC m=+36.923753496" observedRunningTime="2026-04-17 14:34:46.973524167 +0000 UTC m=+39.840909819" watchObservedRunningTime="2026-04-17 14:34:46.974614707 +0000 UTC m=+39.842000357" Apr 17 14:34:47.807278 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:47.807239 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fe006aa3-2754-4ceb-acc9-c8189d25053b-metrics-tls\") pod \"dns-default-kj9w5\" (UID: \"fe006aa3-2754-4ceb-acc9-c8189d25053b\") " pod="openshift-dns/dns-default-kj9w5" Apr 17 14:34:47.807278 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:47.807282 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/76590649-d620-489b-9a3c-5c78ec32d35e-cert\") pod \"ingress-canary-87v7h\" (UID: \"76590649-d620-489b-9a3c-5c78ec32d35e\") " pod="openshift-ingress-canary/ingress-canary-87v7h" Apr 17 14:34:47.807490 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:34:47.807407 2582 
secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 14:34:47.807490 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:34:47.807408 2582 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 14:34:47.807490 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:34:47.807472 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/76590649-d620-489b-9a3c-5c78ec32d35e-cert podName:76590649-d620-489b-9a3c-5c78ec32d35e nodeName:}" failed. No retries permitted until 2026-04-17 14:34:55.80745627 +0000 UTC m=+48.674841907 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/76590649-d620-489b-9a3c-5c78ec32d35e-cert") pod "ingress-canary-87v7h" (UID: "76590649-d620-489b-9a3c-5c78ec32d35e") : secret "canary-serving-cert" not found Apr 17 14:34:47.807490 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:34:47.807485 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fe006aa3-2754-4ceb-acc9-c8189d25053b-metrics-tls podName:fe006aa3-2754-4ceb-acc9-c8189d25053b nodeName:}" failed. No retries permitted until 2026-04-17 14:34:55.807479791 +0000 UTC m=+48.674865420 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/fe006aa3-2754-4ceb-acc9-c8189d25053b-metrics-tls") pod "dns-default-kj9w5" (UID: "fe006aa3-2754-4ceb-acc9-c8189d25053b") : secret "dns-default-metrics-tls" not found Apr 17 14:34:55.863052 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:55.863009 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fe006aa3-2754-4ceb-acc9-c8189d25053b-metrics-tls\") pod \"dns-default-kj9w5\" (UID: \"fe006aa3-2754-4ceb-acc9-c8189d25053b\") " pod="openshift-dns/dns-default-kj9w5" Apr 17 14:34:55.863497 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:34:55.863059 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/76590649-d620-489b-9a3c-5c78ec32d35e-cert\") pod \"ingress-canary-87v7h\" (UID: \"76590649-d620-489b-9a3c-5c78ec32d35e\") " pod="openshift-ingress-canary/ingress-canary-87v7h" Apr 17 14:34:55.863497 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:34:55.863172 2582 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 14:34:55.863497 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:34:55.863173 2582 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 14:34:55.863497 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:34:55.863248 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fe006aa3-2754-4ceb-acc9-c8189d25053b-metrics-tls podName:fe006aa3-2754-4ceb-acc9-c8189d25053b nodeName:}" failed. No retries permitted until 2026-04-17 14:35:11.863225293 +0000 UTC m=+64.730610922 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/fe006aa3-2754-4ceb-acc9-c8189d25053b-metrics-tls") pod "dns-default-kj9w5" (UID: "fe006aa3-2754-4ceb-acc9-c8189d25053b") : secret "dns-default-metrics-tls" not found Apr 17 14:34:55.863497 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:34:55.863262 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/76590649-d620-489b-9a3c-5c78ec32d35e-cert podName:76590649-d620-489b-9a3c-5c78ec32d35e nodeName:}" failed. No retries permitted until 2026-04-17 14:35:11.863255813 +0000 UTC m=+64.730641441 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/76590649-d620-489b-9a3c-5c78ec32d35e-cert") pod "ingress-canary-87v7h" (UID: "76590649-d620-489b-9a3c-5c78ec32d35e") : secret "canary-serving-cert" not found Apr 17 14:35:04.933606 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:35:04.933572 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-f5brp" Apr 17 14:35:11.887383 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:35:11.887341 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fe006aa3-2754-4ceb-acc9-c8189d25053b-metrics-tls\") pod \"dns-default-kj9w5\" (UID: \"fe006aa3-2754-4ceb-acc9-c8189d25053b\") " pod="openshift-dns/dns-default-kj9w5" Apr 17 14:35:11.887383 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:35:11.887386 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/76590649-d620-489b-9a3c-5c78ec32d35e-cert\") pod \"ingress-canary-87v7h\" (UID: \"76590649-d620-489b-9a3c-5c78ec32d35e\") " pod="openshift-ingress-canary/ingress-canary-87v7h" Apr 17 14:35:11.887831 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:35:11.887483 2582 secret.go:189] Couldn't get secret 
openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 14:35:11.887831 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:35:11.887489 2582 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 14:35:11.887831 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:35:11.887553 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/76590649-d620-489b-9a3c-5c78ec32d35e-cert podName:76590649-d620-489b-9a3c-5c78ec32d35e nodeName:}" failed. No retries permitted until 2026-04-17 14:35:43.88753367 +0000 UTC m=+96.754919300 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/76590649-d620-489b-9a3c-5c78ec32d35e-cert") pod "ingress-canary-87v7h" (UID: "76590649-d620-489b-9a3c-5c78ec32d35e") : secret "canary-serving-cert" not found Apr 17 14:35:11.887831 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:35:11.887567 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fe006aa3-2754-4ceb-acc9-c8189d25053b-metrics-tls podName:fe006aa3-2754-4ceb-acc9-c8189d25053b nodeName:}" failed. No retries permitted until 2026-04-17 14:35:43.887561057 +0000 UTC m=+96.754946686 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/fe006aa3-2754-4ceb-acc9-c8189d25053b-metrics-tls") pod "dns-default-kj9w5" (UID: "fe006aa3-2754-4ceb-acc9-c8189d25053b") : secret "dns-default-metrics-tls" not found Apr 17 14:35:12.391828 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:35:12.391760 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fbcf40f6-2ec0-4fb3-85d8-30ecb284384d-metrics-certs\") pod \"network-metrics-daemon-4kdjq\" (UID: \"fbcf40f6-2ec0-4fb3-85d8-30ecb284384d\") " pod="openshift-multus/network-metrics-daemon-4kdjq" Apr 17 14:35:12.394405 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:35:12.394384 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 17 14:35:12.402382 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:35:12.402362 2582 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 17 14:35:12.402439 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:35:12.402429 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fbcf40f6-2ec0-4fb3-85d8-30ecb284384d-metrics-certs podName:fbcf40f6-2ec0-4fb3-85d8-30ecb284384d nodeName:}" failed. No retries permitted until 2026-04-17 14:36:16.402408938 +0000 UTC m=+129.269794570 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fbcf40f6-2ec0-4fb3-85d8-30ecb284384d-metrics-certs") pod "network-metrics-daemon-4kdjq" (UID: "fbcf40f6-2ec0-4fb3-85d8-30ecb284384d") : secret "metrics-daemon-secret" not found Apr 17 14:35:12.492825 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:35:12.492774 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5f9qw\" (UniqueName: \"kubernetes.io/projected/efde8bcb-629a-4cd7-9fe4-cea71e67b06e-kube-api-access-5f9qw\") pod \"network-check-target-jfnzx\" (UID: \"efde8bcb-629a-4cd7-9fe4-cea71e67b06e\") " pod="openshift-network-diagnostics/network-check-target-jfnzx" Apr 17 14:35:12.495360 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:35:12.495338 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 17 14:35:12.505193 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:35:12.505167 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 17 14:35:12.517767 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:35:12.517734 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5f9qw\" (UniqueName: \"kubernetes.io/projected/efde8bcb-629a-4cd7-9fe4-cea71e67b06e-kube-api-access-5f9qw\") pod \"network-check-target-jfnzx\" (UID: \"efde8bcb-629a-4cd7-9fe4-cea71e67b06e\") " pod="openshift-network-diagnostics/network-check-target-jfnzx" Apr 17 14:35:12.653049 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:35:12.652973 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-mnbtt\"" Apr 17 14:35:12.660914 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:35:12.660881 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jfnzx" Apr 17 14:35:12.784643 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:35:12.784599 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-jfnzx"] Apr 17 14:35:12.790041 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:35:12.790012 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podefde8bcb_629a_4cd7_9fe4_cea71e67b06e.slice/crio-a42f74b42c3ae5ee1ec0ca79667c90524a22a2ce79c2dfdec5df261438bea9d2 WatchSource:0}: Error finding container a42f74b42c3ae5ee1ec0ca79667c90524a22a2ce79c2dfdec5df261438bea9d2: Status 404 returned error can't find the container with id a42f74b42c3ae5ee1ec0ca79667c90524a22a2ce79c2dfdec5df261438bea9d2 Apr 17 14:35:13.005128 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:35:13.005093 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-jfnzx" event={"ID":"efde8bcb-629a-4cd7-9fe4-cea71e67b06e","Type":"ContainerStarted","Data":"a42f74b42c3ae5ee1ec0ca79667c90524a22a2ce79c2dfdec5df261438bea9d2"} Apr 17 14:35:16.013537 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:35:16.013498 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-jfnzx" event={"ID":"efde8bcb-629a-4cd7-9fe4-cea71e67b06e","Type":"ContainerStarted","Data":"6b46dca40b7fa4f81253609f0649e7848e36d2caad2736d16d1edc2536387855"} Apr 17 14:35:16.013949 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:35:16.013643 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-jfnzx" Apr 17 14:35:16.028236 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:35:16.028185 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-jfnzx" 
podStartSLOduration=66.347522695 podStartE2EDuration="1m9.02817084s" podCreationTimestamp="2026-04-17 14:34:07 +0000 UTC" firstStartedPulling="2026-04-17 14:35:12.791999421 +0000 UTC m=+65.659385050" lastFinishedPulling="2026-04-17 14:35:15.472647564 +0000 UTC m=+68.340033195" observedRunningTime="2026-04-17 14:35:16.027648162 +0000 UTC m=+68.895033814" watchObservedRunningTime="2026-04-17 14:35:16.02817084 +0000 UTC m=+68.895556488" Apr 17 14:35:43.922900 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:35:43.922864 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fe006aa3-2754-4ceb-acc9-c8189d25053b-metrics-tls\") pod \"dns-default-kj9w5\" (UID: \"fe006aa3-2754-4ceb-acc9-c8189d25053b\") " pod="openshift-dns/dns-default-kj9w5" Apr 17 14:35:43.923320 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:35:43.922907 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/76590649-d620-489b-9a3c-5c78ec32d35e-cert\") pod \"ingress-canary-87v7h\" (UID: \"76590649-d620-489b-9a3c-5c78ec32d35e\") " pod="openshift-ingress-canary/ingress-canary-87v7h" Apr 17 14:35:43.923320 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:35:43.923007 2582 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 14:35:43.923320 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:35:43.923024 2582 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 14:35:43.923320 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:35:43.923070 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/76590649-d620-489b-9a3c-5c78ec32d35e-cert podName:76590649-d620-489b-9a3c-5c78ec32d35e nodeName:}" failed. No retries permitted until 2026-04-17 14:36:47.923049436 +0000 UTC m=+160.790435067 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/76590649-d620-489b-9a3c-5c78ec32d35e-cert") pod "ingress-canary-87v7h" (UID: "76590649-d620-489b-9a3c-5c78ec32d35e") : secret "canary-serving-cert" not found Apr 17 14:35:43.923320 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:35:43.923112 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fe006aa3-2754-4ceb-acc9-c8189d25053b-metrics-tls podName:fe006aa3-2754-4ceb-acc9-c8189d25053b nodeName:}" failed. No retries permitted until 2026-04-17 14:36:47.923097636 +0000 UTC m=+160.790483265 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/fe006aa3-2754-4ceb-acc9-c8189d25053b-metrics-tls") pod "dns-default-kj9w5" (UID: "fe006aa3-2754-4ceb-acc9-c8189d25053b") : secret "dns-default-metrics-tls" not found Apr 17 14:35:47.017417 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:35:47.017381 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-jfnzx" Apr 17 14:36:16.294034 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.293995 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-n2jzl"] Apr 17 14:36:16.296852 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.296831 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-n2jzl" Apr 17 14:36:16.299272 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.299240 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-9rj5v\"" Apr 17 14:36:16.300151 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.300132 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 17 14:36:16.300290 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.300226 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 17 14:36:16.304822 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.302397 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-bss6z"] Apr 17 14:36:16.306418 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.306392 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-7b97fb5868-crdxs"] Apr 17 14:36:16.306535 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.306513 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-bss6z" Apr 17 14:36:16.308604 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.308570 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 17 14:36:16.308879 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.308859 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-bkfp6\"" Apr 17 14:36:16.308879 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.308873 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 17 14:36:16.309012 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.308864 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 17 14:36:16.309012 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.308979 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 17 14:36:16.309297 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.309280 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-n2jzl"] Apr 17 14:36:16.309410 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.309389 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-7b97fb5868-crdxs" Apr 17 14:36:16.311509 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.311490 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 17 14:36:16.311991 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.311971 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 17 14:36:16.312082 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.311987 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 17 14:36:16.312082 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.311988 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-qzqq8\"" Apr 17 14:36:16.315549 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.315523 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-bss6z"] Apr 17 14:36:16.315678 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.315655 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 17 14:36:16.316923 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.316905 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7b97fb5868-crdxs"] Apr 17 14:36:16.319497 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.319480 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 17 14:36:16.396480 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.396446 2582 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-jbzj4"] Apr 17 14:36:16.399314 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.399298 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-jbzj4" Apr 17 14:36:16.401443 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.401413 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 17 14:36:16.401443 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.401413 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 17 14:36:16.401645 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.401453 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 17 14:36:16.401645 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.401485 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-4xs5s\"" Apr 17 14:36:16.403789 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.403768 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-4mfpc"] Apr 17 14:36:16.406666 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.406644 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-4mfpc" Apr 17 14:36:16.406938 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.406914 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-jbzj4"] Apr 17 14:36:16.408978 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.408955 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 17 14:36:16.408978 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.408973 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 17 14:36:16.409128 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.408993 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-jzhmt\"" Apr 17 14:36:16.409128 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.408976 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 17 14:36:16.409959 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.409943 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 17 14:36:16.414611 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.414589 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-4mfpc"] Apr 17 14:36:16.442460 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.442430 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/32387133-ec4d-425b-902f-441e1eccd234-trusted-ca\") pod \"image-registry-7b97fb5868-crdxs\" (UID: \"32387133-ec4d-425b-902f-441e1eccd234\") " 
pod="openshift-image-registry/image-registry-7b97fb5868-crdxs" Apr 17 14:36:16.442460 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.442462 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/32387133-ec4d-425b-902f-441e1eccd234-ca-trust-extracted\") pod \"image-registry-7b97fb5868-crdxs\" (UID: \"32387133-ec4d-425b-902f-441e1eccd234\") " pod="openshift-image-registry/image-registry-7b97fb5868-crdxs" Apr 17 14:36:16.442655 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.442481 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6vmk\" (UniqueName: \"kubernetes.io/projected/32387133-ec4d-425b-902f-441e1eccd234-kube-api-access-l6vmk\") pod \"image-registry-7b97fb5868-crdxs\" (UID: \"32387133-ec4d-425b-902f-441e1eccd234\") " pod="openshift-image-registry/image-registry-7b97fb5868-crdxs" Apr 17 14:36:16.442655 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.442500 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/32387133-ec4d-425b-902f-441e1eccd234-registry-certificates\") pod \"image-registry-7b97fb5868-crdxs\" (UID: \"32387133-ec4d-425b-902f-441e1eccd234\") " pod="openshift-image-registry/image-registry-7b97fb5868-crdxs" Apr 17 14:36:16.442655 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.442544 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8cf1a0e0-95fc-4bce-84ed-e3222e5beeb2-trusted-ca\") pod \"console-operator-9d4b6777b-bss6z\" (UID: \"8cf1a0e0-95fc-4bce-84ed-e3222e5beeb2\") " pod="openshift-console-operator/console-operator-9d4b6777b-bss6z" Apr 17 14:36:16.442655 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.442586 2582 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fbcf40f6-2ec0-4fb3-85d8-30ecb284384d-metrics-certs\") pod \"network-metrics-daemon-4kdjq\" (UID: \"fbcf40f6-2ec0-4fb3-85d8-30ecb284384d\") " pod="openshift-multus/network-metrics-daemon-4kdjq" Apr 17 14:36:16.442655 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.442609 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/32387133-ec4d-425b-902f-441e1eccd234-installation-pull-secrets\") pod \"image-registry-7b97fb5868-crdxs\" (UID: \"32387133-ec4d-425b-902f-441e1eccd234\") " pod="openshift-image-registry/image-registry-7b97fb5868-crdxs" Apr 17 14:36:16.442655 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.442638 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cf1a0e0-95fc-4bce-84ed-e3222e5beeb2-config\") pod \"console-operator-9d4b6777b-bss6z\" (UID: \"8cf1a0e0-95fc-4bce-84ed-e3222e5beeb2\") " pod="openshift-console-operator/console-operator-9d4b6777b-bss6z" Apr 17 14:36:16.442966 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.442669 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhkrd\" (UniqueName: \"kubernetes.io/projected/6bab6f40-808e-4200-a47c-8888edae71b6-kube-api-access-dhkrd\") pod \"volume-data-source-validator-7c6cbb6c87-n2jzl\" (UID: \"6bab6f40-808e-4200-a47c-8888edae71b6\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-n2jzl" Apr 17 14:36:16.442966 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:36:16.442699 2582 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 17 14:36:16.442966 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.442747 2582 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/32387133-ec4d-425b-902f-441e1eccd234-image-registry-private-configuration\") pod \"image-registry-7b97fb5868-crdxs\" (UID: \"32387133-ec4d-425b-902f-441e1eccd234\") " pod="openshift-image-registry/image-registry-7b97fb5868-crdxs" Apr 17 14:36:16.442966 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:36:16.442772 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fbcf40f6-2ec0-4fb3-85d8-30ecb284384d-metrics-certs podName:fbcf40f6-2ec0-4fb3-85d8-30ecb284384d nodeName:}" failed. No retries permitted until 2026-04-17 14:38:18.442749449 +0000 UTC m=+251.310135085 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fbcf40f6-2ec0-4fb3-85d8-30ecb284384d-metrics-certs") pod "network-metrics-daemon-4kdjq" (UID: "fbcf40f6-2ec0-4fb3-85d8-30ecb284384d") : secret "metrics-daemon-secret" not found Apr 17 14:36:16.442966 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.442854 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cf1a0e0-95fc-4bce-84ed-e3222e5beeb2-serving-cert\") pod \"console-operator-9d4b6777b-bss6z\" (UID: \"8cf1a0e0-95fc-4bce-84ed-e3222e5beeb2\") " pod="openshift-console-operator/console-operator-9d4b6777b-bss6z" Apr 17 14:36:16.442966 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.442877 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4j7b\" (UniqueName: \"kubernetes.io/projected/8cf1a0e0-95fc-4bce-84ed-e3222e5beeb2-kube-api-access-b4j7b\") pod \"console-operator-9d4b6777b-bss6z\" (UID: \"8cf1a0e0-95fc-4bce-84ed-e3222e5beeb2\") " pod="openshift-console-operator/console-operator-9d4b6777b-bss6z" Apr 17 
14:36:16.442966 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.442899 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/32387133-ec4d-425b-902f-441e1eccd234-registry-tls\") pod \"image-registry-7b97fb5868-crdxs\" (UID: \"32387133-ec4d-425b-902f-441e1eccd234\") " pod="openshift-image-registry/image-registry-7b97fb5868-crdxs" Apr 17 14:36:16.442966 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.442919 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/32387133-ec4d-425b-902f-441e1eccd234-bound-sa-token\") pod \"image-registry-7b97fb5868-crdxs\" (UID: \"32387133-ec4d-425b-902f-441e1eccd234\") " pod="openshift-image-registry/image-registry-7b97fb5868-crdxs" Apr 17 14:36:16.503052 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.503015 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qq9b8"] Apr 17 14:36:16.507543 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.507434 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-9kpqq"] Apr 17 14:36:16.507710 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.507688 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qq9b8" Apr 17 14:36:16.509854 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.509828 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 17 14:36:16.510004 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.509985 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 17 14:36:16.510254 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.510240 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 17 14:36:16.510336 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.510317 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-b256d\"" Apr 17 14:36:16.510445 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.510429 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 17 14:36:16.510568 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.510551 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-9kpqq" Apr 17 14:36:16.513012 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.512982 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 17 14:36:16.513012 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.513007 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 17 14:36:16.513235 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.513166 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 17 14:36:16.513301 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.513281 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-mfjbl\"" Apr 17 14:36:16.513545 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.513533 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 17 14:36:16.517612 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.517553 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qq9b8"] Apr 17 14:36:16.518863 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.518839 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-9kpqq"] Apr 17 14:36:16.519342 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.519309 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 17 14:36:16.543990 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.543956 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dhkrd\" 
(UniqueName: \"kubernetes.io/projected/6bab6f40-808e-4200-a47c-8888edae71b6-kube-api-access-dhkrd\") pod \"volume-data-source-validator-7c6cbb6c87-n2jzl\" (UID: \"6bab6f40-808e-4200-a47c-8888edae71b6\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-n2jzl" Apr 17 14:36:16.544218 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.544018 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/32387133-ec4d-425b-902f-441e1eccd234-image-registry-private-configuration\") pod \"image-registry-7b97fb5868-crdxs\" (UID: \"32387133-ec4d-425b-902f-441e1eccd234\") " pod="openshift-image-registry/image-registry-7b97fb5868-crdxs" Apr 17 14:36:16.544218 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.544041 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cf1a0e0-95fc-4bce-84ed-e3222e5beeb2-serving-cert\") pod \"console-operator-9d4b6777b-bss6z\" (UID: \"8cf1a0e0-95fc-4bce-84ed-e3222e5beeb2\") " pod="openshift-console-operator/console-operator-9d4b6777b-bss6z" Apr 17 14:36:16.544218 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.544059 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b4j7b\" (UniqueName: \"kubernetes.io/projected/8cf1a0e0-95fc-4bce-84ed-e3222e5beeb2-kube-api-access-b4j7b\") pod \"console-operator-9d4b6777b-bss6z\" (UID: \"8cf1a0e0-95fc-4bce-84ed-e3222e5beeb2\") " pod="openshift-console-operator/console-operator-9d4b6777b-bss6z" Apr 17 14:36:16.544218 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.544080 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/32387133-ec4d-425b-902f-441e1eccd234-registry-tls\") pod \"image-registry-7b97fb5868-crdxs\" (UID: \"32387133-ec4d-425b-902f-441e1eccd234\") " 
pod="openshift-image-registry/image-registry-7b97fb5868-crdxs" Apr 17 14:36:16.544218 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.544105 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/32387133-ec4d-425b-902f-441e1eccd234-bound-sa-token\") pod \"image-registry-7b97fb5868-crdxs\" (UID: \"32387133-ec4d-425b-902f-441e1eccd234\") " pod="openshift-image-registry/image-registry-7b97fb5868-crdxs" Apr 17 14:36:16.544218 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.544161 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/32387133-ec4d-425b-902f-441e1eccd234-trusted-ca\") pod \"image-registry-7b97fb5868-crdxs\" (UID: \"32387133-ec4d-425b-902f-441e1eccd234\") " pod="openshift-image-registry/image-registry-7b97fb5868-crdxs" Apr 17 14:36:16.544218 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.544203 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/bb5ddf65-02bb-4168-a09c-03d98c828a9f-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-jbzj4\" (UID: \"bb5ddf65-02bb-4168-a09c-03d98c828a9f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-jbzj4" Apr 17 14:36:16.544571 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.544232 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/1ccbfbc4-5994-4305-ad57-c1b4c21d6b7e-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-4mfpc\" (UID: \"1ccbfbc4-5994-4305-ad57-c1b4c21d6b7e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-4mfpc" Apr 17 14:36:16.544571 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.544257 2582 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f67hx\" (UniqueName: \"kubernetes.io/projected/1ccbfbc4-5994-4305-ad57-c1b4c21d6b7e-kube-api-access-f67hx\") pod \"cluster-monitoring-operator-75587bd455-4mfpc\" (UID: \"1ccbfbc4-5994-4305-ad57-c1b4c21d6b7e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-4mfpc" Apr 17 14:36:16.544571 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.544293 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/32387133-ec4d-425b-902f-441e1eccd234-ca-trust-extracted\") pod \"image-registry-7b97fb5868-crdxs\" (UID: \"32387133-ec4d-425b-902f-441e1eccd234\") " pod="openshift-image-registry/image-registry-7b97fb5868-crdxs" Apr 17 14:36:16.544571 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:36:16.544263 2582 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 14:36:16.544571 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:36:16.544325 2582 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7b97fb5868-crdxs: secret "image-registry-tls" not found Apr 17 14:36:16.544571 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.544334 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l6vmk\" (UniqueName: \"kubernetes.io/projected/32387133-ec4d-425b-902f-441e1eccd234-kube-api-access-l6vmk\") pod \"image-registry-7b97fb5868-crdxs\" (UID: \"32387133-ec4d-425b-902f-441e1eccd234\") " pod="openshift-image-registry/image-registry-7b97fb5868-crdxs" Apr 17 14:36:16.544571 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.544364 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: 
\"kubernetes.io/configmap/1ccbfbc4-5994-4305-ad57-c1b4c21d6b7e-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-4mfpc\" (UID: \"1ccbfbc4-5994-4305-ad57-c1b4c21d6b7e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-4mfpc" Apr 17 14:36:16.544571 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:36:16.544389 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/32387133-ec4d-425b-902f-441e1eccd234-registry-tls podName:32387133-ec4d-425b-902f-441e1eccd234 nodeName:}" failed. No retries permitted until 2026-04-17 14:36:17.044369035 +0000 UTC m=+129.911754679 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/32387133-ec4d-425b-902f-441e1eccd234-registry-tls") pod "image-registry-7b97fb5868-crdxs" (UID: "32387133-ec4d-425b-902f-441e1eccd234") : secret "image-registry-tls" not found Apr 17 14:36:16.544571 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.544475 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/32387133-ec4d-425b-902f-441e1eccd234-registry-certificates\") pod \"image-registry-7b97fb5868-crdxs\" (UID: \"32387133-ec4d-425b-902f-441e1eccd234\") " pod="openshift-image-registry/image-registry-7b97fb5868-crdxs" Apr 17 14:36:16.544571 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.544504 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8cf1a0e0-95fc-4bce-84ed-e3222e5beeb2-trusted-ca\") pod \"console-operator-9d4b6777b-bss6z\" (UID: \"8cf1a0e0-95fc-4bce-84ed-e3222e5beeb2\") " pod="openshift-console-operator/console-operator-9d4b6777b-bss6z" Apr 17 14:36:16.544571 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.544547 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" 
(UniqueName: \"kubernetes.io/secret/32387133-ec4d-425b-902f-441e1eccd234-installation-pull-secrets\") pod \"image-registry-7b97fb5868-crdxs\" (UID: \"32387133-ec4d-425b-902f-441e1eccd234\") " pod="openshift-image-registry/image-registry-7b97fb5868-crdxs" Apr 17 14:36:16.545140 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.544583 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cf1a0e0-95fc-4bce-84ed-e3222e5beeb2-config\") pod \"console-operator-9d4b6777b-bss6z\" (UID: \"8cf1a0e0-95fc-4bce-84ed-e3222e5beeb2\") " pod="openshift-console-operator/console-operator-9d4b6777b-bss6z" Apr 17 14:36:16.545140 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.544630 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6g4gw\" (UniqueName: \"kubernetes.io/projected/bb5ddf65-02bb-4168-a09c-03d98c828a9f-kube-api-access-6g4gw\") pod \"cluster-samples-operator-6dc5bdb6b4-jbzj4\" (UID: \"bb5ddf65-02bb-4168-a09c-03d98c828a9f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-jbzj4" Apr 17 14:36:16.545830 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.545782 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/32387133-ec4d-425b-902f-441e1eccd234-ca-trust-extracted\") pod \"image-registry-7b97fb5868-crdxs\" (UID: \"32387133-ec4d-425b-902f-441e1eccd234\") " pod="openshift-image-registry/image-registry-7b97fb5868-crdxs" Apr 17 14:36:16.545999 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.545936 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/32387133-ec4d-425b-902f-441e1eccd234-registry-certificates\") pod \"image-registry-7b97fb5868-crdxs\" (UID: \"32387133-ec4d-425b-902f-441e1eccd234\") " 
pod="openshift-image-registry/image-registry-7b97fb5868-crdxs" Apr 17 14:36:16.546065 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.546026 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8cf1a0e0-95fc-4bce-84ed-e3222e5beeb2-trusted-ca\") pod \"console-operator-9d4b6777b-bss6z\" (UID: \"8cf1a0e0-95fc-4bce-84ed-e3222e5beeb2\") " pod="openshift-console-operator/console-operator-9d4b6777b-bss6z" Apr 17 14:36:16.546193 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.546167 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/32387133-ec4d-425b-902f-441e1eccd234-trusted-ca\") pod \"image-registry-7b97fb5868-crdxs\" (UID: \"32387133-ec4d-425b-902f-441e1eccd234\") " pod="openshift-image-registry/image-registry-7b97fb5868-crdxs" Apr 17 14:36:16.546368 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.546343 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cf1a0e0-95fc-4bce-84ed-e3222e5beeb2-config\") pod \"console-operator-9d4b6777b-bss6z\" (UID: \"8cf1a0e0-95fc-4bce-84ed-e3222e5beeb2\") " pod="openshift-console-operator/console-operator-9d4b6777b-bss6z" Apr 17 14:36:16.547063 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.547044 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cf1a0e0-95fc-4bce-84ed-e3222e5beeb2-serving-cert\") pod \"console-operator-9d4b6777b-bss6z\" (UID: \"8cf1a0e0-95fc-4bce-84ed-e3222e5beeb2\") " pod="openshift-console-operator/console-operator-9d4b6777b-bss6z" Apr 17 14:36:16.547420 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.547401 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: 
\"kubernetes.io/secret/32387133-ec4d-425b-902f-441e1eccd234-image-registry-private-configuration\") pod \"image-registry-7b97fb5868-crdxs\" (UID: \"32387133-ec4d-425b-902f-441e1eccd234\") " pod="openshift-image-registry/image-registry-7b97fb5868-crdxs" Apr 17 14:36:16.547493 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.547421 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/32387133-ec4d-425b-902f-441e1eccd234-installation-pull-secrets\") pod \"image-registry-7b97fb5868-crdxs\" (UID: \"32387133-ec4d-425b-902f-441e1eccd234\") " pod="openshift-image-registry/image-registry-7b97fb5868-crdxs" Apr 17 14:36:16.555467 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.555443 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4j7b\" (UniqueName: \"kubernetes.io/projected/8cf1a0e0-95fc-4bce-84ed-e3222e5beeb2-kube-api-access-b4j7b\") pod \"console-operator-9d4b6777b-bss6z\" (UID: \"8cf1a0e0-95fc-4bce-84ed-e3222e5beeb2\") " pod="openshift-console-operator/console-operator-9d4b6777b-bss6z" Apr 17 14:36:16.555738 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.555718 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6vmk\" (UniqueName: \"kubernetes.io/projected/32387133-ec4d-425b-902f-441e1eccd234-kube-api-access-l6vmk\") pod \"image-registry-7b97fb5868-crdxs\" (UID: \"32387133-ec4d-425b-902f-441e1eccd234\") " pod="openshift-image-registry/image-registry-7b97fb5868-crdxs" Apr 17 14:36:16.562381 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.562359 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhkrd\" (UniqueName: \"kubernetes.io/projected/6bab6f40-808e-4200-a47c-8888edae71b6-kube-api-access-dhkrd\") pod \"volume-data-source-validator-7c6cbb6c87-n2jzl\" (UID: \"6bab6f40-808e-4200-a47c-8888edae71b6\") " 
pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-n2jzl" Apr 17 14:36:16.565743 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.565715 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/32387133-ec4d-425b-902f-441e1eccd234-bound-sa-token\") pod \"image-registry-7b97fb5868-crdxs\" (UID: \"32387133-ec4d-425b-902f-441e1eccd234\") " pod="openshift-image-registry/image-registry-7b97fb5868-crdxs" Apr 17 14:36:16.607864 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.607825 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-n2jzl" Apr 17 14:36:16.618201 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.618172 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-bss6z" Apr 17 14:36:16.646843 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.646663 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjwr2\" (UniqueName: \"kubernetes.io/projected/c73897be-24cb-49ee-a735-e2c35eb461f4-kube-api-access-xjwr2\") pod \"insights-operator-585dfdc468-9kpqq\" (UID: \"c73897be-24cb-49ee-a735-e2c35eb461f4\") " pod="openshift-insights/insights-operator-585dfdc468-9kpqq" Apr 17 14:36:16.646843 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.646742 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/84ae5f81-3a81-49e9-a3fb-cad167d0281b-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-qq9b8\" (UID: \"84ae5f81-3a81-49e9-a3fb-cad167d0281b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qq9b8" Apr 17 14:36:16.646843 ip-10-0-143-171 
kubenswrapper[2582]: I0417 14:36:16.646779 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c73897be-24cb-49ee-a735-e2c35eb461f4-service-ca-bundle\") pod \"insights-operator-585dfdc468-9kpqq\" (UID: \"c73897be-24cb-49ee-a735-e2c35eb461f4\") " pod="openshift-insights/insights-operator-585dfdc468-9kpqq" Apr 17 14:36:16.646843 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.646839 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c73897be-24cb-49ee-a735-e2c35eb461f4-tmp\") pod \"insights-operator-585dfdc468-9kpqq\" (UID: \"c73897be-24cb-49ee-a735-e2c35eb461f4\") " pod="openshift-insights/insights-operator-585dfdc468-9kpqq" Apr 17 14:36:16.647193 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.646866 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c73897be-24cb-49ee-a735-e2c35eb461f4-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-9kpqq\" (UID: \"c73897be-24cb-49ee-a735-e2c35eb461f4\") " pod="openshift-insights/insights-operator-585dfdc468-9kpqq" Apr 17 14:36:16.647193 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.646920 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6g4gw\" (UniqueName: \"kubernetes.io/projected/bb5ddf65-02bb-4168-a09c-03d98c828a9f-kube-api-access-6g4gw\") pod \"cluster-samples-operator-6dc5bdb6b4-jbzj4\" (UID: \"bb5ddf65-02bb-4168-a09c-03d98c828a9f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-jbzj4" Apr 17 14:36:16.647193 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.646958 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/1ccbfbc4-5994-4305-ad57-c1b4c21d6b7e-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-4mfpc\" (UID: \"1ccbfbc4-5994-4305-ad57-c1b4c21d6b7e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-4mfpc" Apr 17 14:36:16.647193 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.646992 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgj27\" (UniqueName: \"kubernetes.io/projected/84ae5f81-3a81-49e9-a3fb-cad167d0281b-kube-api-access-fgj27\") pod \"kube-storage-version-migrator-operator-6769c5d45-qq9b8\" (UID: \"84ae5f81-3a81-49e9-a3fb-cad167d0281b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qq9b8" Apr 17 14:36:16.647193 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.647021 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84ae5f81-3a81-49e9-a3fb-cad167d0281b-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-qq9b8\" (UID: \"84ae5f81-3a81-49e9-a3fb-cad167d0281b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qq9b8" Apr 17 14:36:16.647193 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.647049 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f67hx\" (UniqueName: \"kubernetes.io/projected/1ccbfbc4-5994-4305-ad57-c1b4c21d6b7e-kube-api-access-f67hx\") pod \"cluster-monitoring-operator-75587bd455-4mfpc\" (UID: \"1ccbfbc4-5994-4305-ad57-c1b4c21d6b7e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-4mfpc" Apr 17 14:36:16.647193 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.647077 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/bb5ddf65-02bb-4168-a09c-03d98c828a9f-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-jbzj4\" (UID: \"bb5ddf65-02bb-4168-a09c-03d98c828a9f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-jbzj4" Apr 17 14:36:16.647193 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.647104 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/1ccbfbc4-5994-4305-ad57-c1b4c21d6b7e-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-4mfpc\" (UID: \"1ccbfbc4-5994-4305-ad57-c1b4c21d6b7e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-4mfpc" Apr 17 14:36:16.647193 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.647132 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/c73897be-24cb-49ee-a735-e2c35eb461f4-snapshots\") pod \"insights-operator-585dfdc468-9kpqq\" (UID: \"c73897be-24cb-49ee-a735-e2c35eb461f4\") " pod="openshift-insights/insights-operator-585dfdc468-9kpqq" Apr 17 14:36:16.647193 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.647158 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c73897be-24cb-49ee-a735-e2c35eb461f4-serving-cert\") pod \"insights-operator-585dfdc468-9kpqq\" (UID: \"c73897be-24cb-49ee-a735-e2c35eb461f4\") " pod="openshift-insights/insights-operator-585dfdc468-9kpqq" Apr 17 14:36:16.647655 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:36:16.647623 2582 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 14:36:16.647719 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:36:16.647657 2582 secret.go:189] Couldn't get secret 
openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 14:36:16.647756 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:36:16.647719 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ccbfbc4-5994-4305-ad57-c1b4c21d6b7e-cluster-monitoring-operator-tls podName:1ccbfbc4-5994-4305-ad57-c1b4c21d6b7e nodeName:}" failed. No retries permitted until 2026-04-17 14:36:17.147696264 +0000 UTC m=+130.015081895 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/1ccbfbc4-5994-4305-ad57-c1b4c21d6b7e-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-4mfpc" (UID: "1ccbfbc4-5994-4305-ad57-c1b4c21d6b7e") : secret "cluster-monitoring-operator-tls" not found Apr 17 14:36:16.648315 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.648291 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/1ccbfbc4-5994-4305-ad57-c1b4c21d6b7e-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-4mfpc\" (UID: \"1ccbfbc4-5994-4305-ad57-c1b4c21d6b7e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-4mfpc" Apr 17 14:36:16.648437 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:36:16.648419 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bb5ddf65-02bb-4168-a09c-03d98c828a9f-samples-operator-tls podName:bb5ddf65-02bb-4168-a09c-03d98c828a9f nodeName:}" failed. No retries permitted until 2026-04-17 14:36:17.148401417 +0000 UTC m=+130.015787050 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/bb5ddf65-02bb-4168-a09c-03d98c828a9f-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-jbzj4" (UID: "bb5ddf65-02bb-4168-a09c-03d98c828a9f") : secret "samples-operator-tls" not found Apr 17 14:36:16.655827 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.655782 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6g4gw\" (UniqueName: \"kubernetes.io/projected/bb5ddf65-02bb-4168-a09c-03d98c828a9f-kube-api-access-6g4gw\") pod \"cluster-samples-operator-6dc5bdb6b4-jbzj4\" (UID: \"bb5ddf65-02bb-4168-a09c-03d98c828a9f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-jbzj4" Apr 17 14:36:16.656576 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.656534 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f67hx\" (UniqueName: \"kubernetes.io/projected/1ccbfbc4-5994-4305-ad57-c1b4c21d6b7e-kube-api-access-f67hx\") pod \"cluster-monitoring-operator-75587bd455-4mfpc\" (UID: \"1ccbfbc4-5994-4305-ad57-c1b4c21d6b7e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-4mfpc" Apr 17 14:36:16.734509 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.734473 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-n2jzl"] Apr 17 14:36:16.738031 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:36:16.737990 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6bab6f40_808e_4200_a47c_8888edae71b6.slice/crio-b71aa8e6d66be4c3a7bbde664f2adb0d99582ecfe70124c3de390d3be3ecf704 WatchSource:0}: Error finding container b71aa8e6d66be4c3a7bbde664f2adb0d99582ecfe70124c3de390d3be3ecf704: Status 404 returned error can't find the container with id 
b71aa8e6d66be4c3a7bbde664f2adb0d99582ecfe70124c3de390d3be3ecf704 Apr 17 14:36:16.748182 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.748158 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/c73897be-24cb-49ee-a735-e2c35eb461f4-snapshots\") pod \"insights-operator-585dfdc468-9kpqq\" (UID: \"c73897be-24cb-49ee-a735-e2c35eb461f4\") " pod="openshift-insights/insights-operator-585dfdc468-9kpqq" Apr 17 14:36:16.748266 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.748190 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c73897be-24cb-49ee-a735-e2c35eb461f4-serving-cert\") pod \"insights-operator-585dfdc468-9kpqq\" (UID: \"c73897be-24cb-49ee-a735-e2c35eb461f4\") " pod="openshift-insights/insights-operator-585dfdc468-9kpqq" Apr 17 14:36:16.748266 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.748216 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xjwr2\" (UniqueName: \"kubernetes.io/projected/c73897be-24cb-49ee-a735-e2c35eb461f4-kube-api-access-xjwr2\") pod \"insights-operator-585dfdc468-9kpqq\" (UID: \"c73897be-24cb-49ee-a735-e2c35eb461f4\") " pod="openshift-insights/insights-operator-585dfdc468-9kpqq" Apr 17 14:36:16.748375 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.748352 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/84ae5f81-3a81-49e9-a3fb-cad167d0281b-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-qq9b8\" (UID: \"84ae5f81-3a81-49e9-a3fb-cad167d0281b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qq9b8" Apr 17 14:36:16.748422 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.748408 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c73897be-24cb-49ee-a735-e2c35eb461f4-service-ca-bundle\") pod \"insights-operator-585dfdc468-9kpqq\" (UID: \"c73897be-24cb-49ee-a735-e2c35eb461f4\") " pod="openshift-insights/insights-operator-585dfdc468-9kpqq" Apr 17 14:36:16.748483 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.748469 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c73897be-24cb-49ee-a735-e2c35eb461f4-tmp\") pod \"insights-operator-585dfdc468-9kpqq\" (UID: \"c73897be-24cb-49ee-a735-e2c35eb461f4\") " pod="openshift-insights/insights-operator-585dfdc468-9kpqq" Apr 17 14:36:16.748521 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.748504 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c73897be-24cb-49ee-a735-e2c35eb461f4-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-9kpqq\" (UID: \"c73897be-24cb-49ee-a735-e2c35eb461f4\") " pod="openshift-insights/insights-operator-585dfdc468-9kpqq" Apr 17 14:36:16.748627 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.748603 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fgj27\" (UniqueName: \"kubernetes.io/projected/84ae5f81-3a81-49e9-a3fb-cad167d0281b-kube-api-access-fgj27\") pod \"kube-storage-version-migrator-operator-6769c5d45-qq9b8\" (UID: \"84ae5f81-3a81-49e9-a3fb-cad167d0281b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qq9b8" Apr 17 14:36:16.748726 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.748643 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84ae5f81-3a81-49e9-a3fb-cad167d0281b-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-qq9b8\" (UID: 
\"84ae5f81-3a81-49e9-a3fb-cad167d0281b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qq9b8" Apr 17 14:36:16.748788 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.748774 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c73897be-24cb-49ee-a735-e2c35eb461f4-tmp\") pod \"insights-operator-585dfdc468-9kpqq\" (UID: \"c73897be-24cb-49ee-a735-e2c35eb461f4\") " pod="openshift-insights/insights-operator-585dfdc468-9kpqq" Apr 17 14:36:16.748938 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.748915 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/c73897be-24cb-49ee-a735-e2c35eb461f4-snapshots\") pod \"insights-operator-585dfdc468-9kpqq\" (UID: \"c73897be-24cb-49ee-a735-e2c35eb461f4\") " pod="openshift-insights/insights-operator-585dfdc468-9kpqq" Apr 17 14:36:16.749066 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.749048 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c73897be-24cb-49ee-a735-e2c35eb461f4-service-ca-bundle\") pod \"insights-operator-585dfdc468-9kpqq\" (UID: \"c73897be-24cb-49ee-a735-e2c35eb461f4\") " pod="openshift-insights/insights-operator-585dfdc468-9kpqq" Apr 17 14:36:16.749115 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.749102 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84ae5f81-3a81-49e9-a3fb-cad167d0281b-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-qq9b8\" (UID: \"84ae5f81-3a81-49e9-a3fb-cad167d0281b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qq9b8" Apr 17 14:36:16.749864 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.749845 2582 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c73897be-24cb-49ee-a735-e2c35eb461f4-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-9kpqq\" (UID: \"c73897be-24cb-49ee-a735-e2c35eb461f4\") " pod="openshift-insights/insights-operator-585dfdc468-9kpqq" Apr 17 14:36:16.750881 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.750864 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c73897be-24cb-49ee-a735-e2c35eb461f4-serving-cert\") pod \"insights-operator-585dfdc468-9kpqq\" (UID: \"c73897be-24cb-49ee-a735-e2c35eb461f4\") " pod="openshift-insights/insights-operator-585dfdc468-9kpqq" Apr 17 14:36:16.751712 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.751686 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/84ae5f81-3a81-49e9-a3fb-cad167d0281b-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-qq9b8\" (UID: \"84ae5f81-3a81-49e9-a3fb-cad167d0281b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qq9b8" Apr 17 14:36:16.752617 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.752584 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-bss6z"] Apr 17 14:36:16.755707 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:36:16.755680 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8cf1a0e0_95fc_4bce_84ed_e3222e5beeb2.slice/crio-e3c712ce229b46488d238f5dbc3a7c0d54f08d2ad378b5dc6782e26b706bc91a WatchSource:0}: Error finding container e3c712ce229b46488d238f5dbc3a7c0d54f08d2ad378b5dc6782e26b706bc91a: Status 404 returned error can't find the container with id e3c712ce229b46488d238f5dbc3a7c0d54f08d2ad378b5dc6782e26b706bc91a Apr 17 14:36:16.756610 ip-10-0-143-171 
kubenswrapper[2582]: I0417 14:36:16.756591 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjwr2\" (UniqueName: \"kubernetes.io/projected/c73897be-24cb-49ee-a735-e2c35eb461f4-kube-api-access-xjwr2\") pod \"insights-operator-585dfdc468-9kpqq\" (UID: \"c73897be-24cb-49ee-a735-e2c35eb461f4\") " pod="openshift-insights/insights-operator-585dfdc468-9kpqq" Apr 17 14:36:16.756692 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.756645 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgj27\" (UniqueName: \"kubernetes.io/projected/84ae5f81-3a81-49e9-a3fb-cad167d0281b-kube-api-access-fgj27\") pod \"kube-storage-version-migrator-operator-6769c5d45-qq9b8\" (UID: \"84ae5f81-3a81-49e9-a3fb-cad167d0281b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qq9b8" Apr 17 14:36:16.820160 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.820069 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qq9b8" Apr 17 14:36:16.824861 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.824835 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-9kpqq" Apr 17 14:36:16.940583 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.940551 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qq9b8"] Apr 17 14:36:16.944651 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:36:16.944610 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84ae5f81_3a81_49e9_a3fb_cad167d0281b.slice/crio-296f29b848df80ed6ed34735e7de9a16229b2b917531c26a60bc1216de13716b WatchSource:0}: Error finding container 296f29b848df80ed6ed34735e7de9a16229b2b917531c26a60bc1216de13716b: Status 404 returned error can't find the container with id 296f29b848df80ed6ed34735e7de9a16229b2b917531c26a60bc1216de13716b Apr 17 14:36:16.958564 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:16.958536 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-9kpqq"] Apr 17 14:36:16.961407 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:36:16.961380 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc73897be_24cb_49ee_a735_e2c35eb461f4.slice/crio-d0d251c236089f74f44a9d6478e6d2f078e3cc9ac453ed9c521cd6e7bdc92d02 WatchSource:0}: Error finding container d0d251c236089f74f44a9d6478e6d2f078e3cc9ac453ed9c521cd6e7bdc92d02: Status 404 returned error can't find the container with id d0d251c236089f74f44a9d6478e6d2f078e3cc9ac453ed9c521cd6e7bdc92d02 Apr 17 14:36:17.052047 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:17.052014 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/32387133-ec4d-425b-902f-441e1eccd234-registry-tls\") pod \"image-registry-7b97fb5868-crdxs\" (UID: 
\"32387133-ec4d-425b-902f-441e1eccd234\") " pod="openshift-image-registry/image-registry-7b97fb5868-crdxs" Apr 17 14:36:17.052220 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:36:17.052128 2582 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 14:36:17.052220 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:36:17.052140 2582 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7b97fb5868-crdxs: secret "image-registry-tls" not found Apr 17 14:36:17.052220 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:36:17.052191 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/32387133-ec4d-425b-902f-441e1eccd234-registry-tls podName:32387133-ec4d-425b-902f-441e1eccd234 nodeName:}" failed. No retries permitted until 2026-04-17 14:36:18.05217712 +0000 UTC m=+130.919562748 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/32387133-ec4d-425b-902f-441e1eccd234-registry-tls") pod "image-registry-7b97fb5868-crdxs" (UID: "32387133-ec4d-425b-902f-441e1eccd234") : secret "image-registry-tls" not found Apr 17 14:36:17.125771 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:17.125678 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-n2jzl" event={"ID":"6bab6f40-808e-4200-a47c-8888edae71b6","Type":"ContainerStarted","Data":"b71aa8e6d66be4c3a7bbde664f2adb0d99582ecfe70124c3de390d3be3ecf704"} Apr 17 14:36:17.126596 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:17.126568 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-9kpqq" event={"ID":"c73897be-24cb-49ee-a735-e2c35eb461f4","Type":"ContainerStarted","Data":"d0d251c236089f74f44a9d6478e6d2f078e3cc9ac453ed9c521cd6e7bdc92d02"} Apr 17 14:36:17.127404 
ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:17.127384 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qq9b8" event={"ID":"84ae5f81-3a81-49e9-a3fb-cad167d0281b","Type":"ContainerStarted","Data":"296f29b848df80ed6ed34735e7de9a16229b2b917531c26a60bc1216de13716b"} Apr 17 14:36:17.128252 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:17.128231 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-bss6z" event={"ID":"8cf1a0e0-95fc-4bce-84ed-e3222e5beeb2","Type":"ContainerStarted","Data":"e3c712ce229b46488d238f5dbc3a7c0d54f08d2ad378b5dc6782e26b706bc91a"} Apr 17 14:36:17.152723 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:17.152689 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/1ccbfbc4-5994-4305-ad57-c1b4c21d6b7e-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-4mfpc\" (UID: \"1ccbfbc4-5994-4305-ad57-c1b4c21d6b7e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-4mfpc" Apr 17 14:36:17.152723 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:17.152730 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/bb5ddf65-02bb-4168-a09c-03d98c828a9f-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-jbzj4\" (UID: \"bb5ddf65-02bb-4168-a09c-03d98c828a9f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-jbzj4" Apr 17 14:36:17.152940 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:36:17.152892 2582 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 14:36:17.152977 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:36:17.152955 2582 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bb5ddf65-02bb-4168-a09c-03d98c828a9f-samples-operator-tls podName:bb5ddf65-02bb-4168-a09c-03d98c828a9f nodeName:}" failed. No retries permitted until 2026-04-17 14:36:18.152939201 +0000 UTC m=+131.020324830 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/bb5ddf65-02bb-4168-a09c-03d98c828a9f-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-jbzj4" (UID: "bb5ddf65-02bb-4168-a09c-03d98c828a9f") : secret "samples-operator-tls" not found Apr 17 14:36:17.153022 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:36:17.152892 2582 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 14:36:17.153067 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:36:17.153042 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ccbfbc4-5994-4305-ad57-c1b4c21d6b7e-cluster-monitoring-operator-tls podName:1ccbfbc4-5994-4305-ad57-c1b4c21d6b7e nodeName:}" failed. No retries permitted until 2026-04-17 14:36:18.153025104 +0000 UTC m=+131.020410743 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/1ccbfbc4-5994-4305-ad57-c1b4c21d6b7e-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-4mfpc" (UID: "1ccbfbc4-5994-4305-ad57-c1b4c21d6b7e") : secret "cluster-monitoring-operator-tls" not found Apr 17 14:36:18.061446 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:18.061394 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/32387133-ec4d-425b-902f-441e1eccd234-registry-tls\") pod \"image-registry-7b97fb5868-crdxs\" (UID: \"32387133-ec4d-425b-902f-441e1eccd234\") " pod="openshift-image-registry/image-registry-7b97fb5868-crdxs" Apr 17 14:36:18.061938 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:36:18.061570 2582 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 14:36:18.061938 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:36:18.061594 2582 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7b97fb5868-crdxs: secret "image-registry-tls" not found Apr 17 14:36:18.061938 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:36:18.061659 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/32387133-ec4d-425b-902f-441e1eccd234-registry-tls podName:32387133-ec4d-425b-902f-441e1eccd234 nodeName:}" failed. No retries permitted until 2026-04-17 14:36:20.061637699 +0000 UTC m=+132.929023352 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/32387133-ec4d-425b-902f-441e1eccd234-registry-tls") pod "image-registry-7b97fb5868-crdxs" (UID: "32387133-ec4d-425b-902f-441e1eccd234") : secret "image-registry-tls" not found Apr 17 14:36:18.162824 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:18.162505 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/1ccbfbc4-5994-4305-ad57-c1b4c21d6b7e-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-4mfpc\" (UID: \"1ccbfbc4-5994-4305-ad57-c1b4c21d6b7e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-4mfpc" Apr 17 14:36:18.162824 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:18.162574 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/bb5ddf65-02bb-4168-a09c-03d98c828a9f-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-jbzj4\" (UID: \"bb5ddf65-02bb-4168-a09c-03d98c828a9f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-jbzj4" Apr 17 14:36:18.162824 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:36:18.162677 2582 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 14:36:18.162824 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:36:18.162728 2582 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 14:36:18.162824 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:36:18.162757 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ccbfbc4-5994-4305-ad57-c1b4c21d6b7e-cluster-monitoring-operator-tls podName:1ccbfbc4-5994-4305-ad57-c1b4c21d6b7e nodeName:}" failed. 
No retries permitted until 2026-04-17 14:36:20.162735322 +0000 UTC m=+133.030120952 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/1ccbfbc4-5994-4305-ad57-c1b4c21d6b7e-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-4mfpc" (UID: "1ccbfbc4-5994-4305-ad57-c1b4c21d6b7e") : secret "cluster-monitoring-operator-tls" not found Apr 17 14:36:18.162824 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:36:18.162775 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bb5ddf65-02bb-4168-a09c-03d98c828a9f-samples-operator-tls podName:bb5ddf65-02bb-4168-a09c-03d98c828a9f nodeName:}" failed. No retries permitted until 2026-04-17 14:36:20.162765167 +0000 UTC m=+133.030150819 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/bb5ddf65-02bb-4168-a09c-03d98c828a9f-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-jbzj4" (UID: "bb5ddf65-02bb-4168-a09c-03d98c828a9f") : secret "samples-operator-tls" not found Apr 17 14:36:19.007031 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:19.006997 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-rl7lx"] Apr 17 14:36:19.010038 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:19.010013 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-rl7lx" Apr 17 14:36:19.012231 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:19.012208 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-rkmlt\"" Apr 17 14:36:19.016175 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:19.016130 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-rl7lx"] Apr 17 14:36:19.133823 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:19.133759 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-n2jzl" event={"ID":"6bab6f40-808e-4200-a47c-8888edae71b6","Type":"ContainerStarted","Data":"af572655d16f5da780802b9f9438439587c490fe4a64af42c5697315d468480c"} Apr 17 14:36:19.148369 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:19.148319 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-n2jzl" podStartSLOduration=1.5737938 podStartE2EDuration="3.148304153s" podCreationTimestamp="2026-04-17 14:36:16 +0000 UTC" firstStartedPulling="2026-04-17 14:36:16.739749878 +0000 UTC m=+129.607135507" lastFinishedPulling="2026-04-17 14:36:18.314260216 +0000 UTC m=+131.181645860" observedRunningTime="2026-04-17 14:36:19.147234572 +0000 UTC m=+132.014620223" watchObservedRunningTime="2026-04-17 14:36:19.148304153 +0000 UTC m=+132.015689803" Apr 17 14:36:19.171699 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:19.171656 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hv4d8\" (UniqueName: \"kubernetes.io/projected/93ac31c3-23e6-4891-a746-c22dabdbc864-kube-api-access-hv4d8\") pod \"network-check-source-8894fc9bd-rl7lx\" (UID: 
\"93ac31c3-23e6-4891-a746-c22dabdbc864\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-rl7lx" Apr 17 14:36:19.273284 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:19.273181 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hv4d8\" (UniqueName: \"kubernetes.io/projected/93ac31c3-23e6-4891-a746-c22dabdbc864-kube-api-access-hv4d8\") pod \"network-check-source-8894fc9bd-rl7lx\" (UID: \"93ac31c3-23e6-4891-a746-c22dabdbc864\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-rl7lx" Apr 17 14:36:19.281418 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:19.281387 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hv4d8\" (UniqueName: \"kubernetes.io/projected/93ac31c3-23e6-4891-a746-c22dabdbc864-kube-api-access-hv4d8\") pod \"network-check-source-8894fc9bd-rl7lx\" (UID: \"93ac31c3-23e6-4891-a746-c22dabdbc864\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-rl7lx" Apr 17 14:36:19.322188 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:19.322127 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-rl7lx" Apr 17 14:36:20.080580 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:20.080548 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/32387133-ec4d-425b-902f-441e1eccd234-registry-tls\") pod \"image-registry-7b97fb5868-crdxs\" (UID: \"32387133-ec4d-425b-902f-441e1eccd234\") " pod="openshift-image-registry/image-registry-7b97fb5868-crdxs" Apr 17 14:36:20.080736 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:36:20.080664 2582 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 14:36:20.080736 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:36:20.080678 2582 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7b97fb5868-crdxs: secret "image-registry-tls" not found Apr 17 14:36:20.080824 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:36:20.080738 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/32387133-ec4d-425b-902f-441e1eccd234-registry-tls podName:32387133-ec4d-425b-902f-441e1eccd234 nodeName:}" failed. No retries permitted until 2026-04-17 14:36:24.080722038 +0000 UTC m=+136.948107668 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/32387133-ec4d-425b-902f-441e1eccd234-registry-tls") pod "image-registry-7b97fb5868-crdxs" (UID: "32387133-ec4d-425b-902f-441e1eccd234") : secret "image-registry-tls" not found Apr 17 14:36:20.181123 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:20.181093 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/1ccbfbc4-5994-4305-ad57-c1b4c21d6b7e-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-4mfpc\" (UID: \"1ccbfbc4-5994-4305-ad57-c1b4c21d6b7e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-4mfpc" Apr 17 14:36:20.181469 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:20.181145 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/bb5ddf65-02bb-4168-a09c-03d98c828a9f-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-jbzj4\" (UID: \"bb5ddf65-02bb-4168-a09c-03d98c828a9f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-jbzj4" Apr 17 14:36:20.181469 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:36:20.181231 2582 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 14:36:20.181469 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:36:20.181298 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ccbfbc4-5994-4305-ad57-c1b4c21d6b7e-cluster-monitoring-operator-tls podName:1ccbfbc4-5994-4305-ad57-c1b4c21d6b7e nodeName:}" failed. No retries permitted until 2026-04-17 14:36:24.181280508 +0000 UTC m=+137.048666137 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/1ccbfbc4-5994-4305-ad57-c1b4c21d6b7e-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-4mfpc" (UID: "1ccbfbc4-5994-4305-ad57-c1b4c21d6b7e") : secret "cluster-monitoring-operator-tls" not found Apr 17 14:36:20.181469 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:36:20.181303 2582 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 14:36:20.181469 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:36:20.181380 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bb5ddf65-02bb-4168-a09c-03d98c828a9f-samples-operator-tls podName:bb5ddf65-02bb-4168-a09c-03d98c828a9f nodeName:}" failed. No retries permitted until 2026-04-17 14:36:24.181361135 +0000 UTC m=+137.048746768 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/bb5ddf65-02bb-4168-a09c-03d98c828a9f-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-jbzj4" (UID: "bb5ddf65-02bb-4168-a09c-03d98c828a9f") : secret "samples-operator-tls" not found Apr 17 14:36:20.288312 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:20.288254 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-rl7lx"] Apr 17 14:36:20.294368 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:36:20.294322 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93ac31c3_23e6_4891_a746_c22dabdbc864.slice/crio-150a8fda0885b3f28af4e07cf03559899de1abd1650592d235c2aa28a94a90bc WatchSource:0}: Error finding container 150a8fda0885b3f28af4e07cf03559899de1abd1650592d235c2aa28a94a90bc: Status 404 returned error can't find the container with id 
150a8fda0885b3f28af4e07cf03559899de1abd1650592d235c2aa28a94a90bc Apr 17 14:36:21.140798 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:21.140753 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-rl7lx" event={"ID":"93ac31c3-23e6-4891-a746-c22dabdbc864","Type":"ContainerStarted","Data":"2250bc7760a9c48411009a609268dc94bf6760885befca0f7184893d3d9abeb5"} Apr 17 14:36:21.141008 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:21.140825 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-rl7lx" event={"ID":"93ac31c3-23e6-4891-a746-c22dabdbc864","Type":"ContainerStarted","Data":"150a8fda0885b3f28af4e07cf03559899de1abd1650592d235c2aa28a94a90bc"} Apr 17 14:36:21.142120 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:21.142081 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-9kpqq" event={"ID":"c73897be-24cb-49ee-a735-e2c35eb461f4","Type":"ContainerStarted","Data":"185f47a56f260d76058c154b7396d3d408787c493b43b66122d0ab7bfcc9f539"} Apr 17 14:36:21.143478 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:21.143449 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qq9b8" event={"ID":"84ae5f81-3a81-49e9-a3fb-cad167d0281b","Type":"ContainerStarted","Data":"2a7a50fde6eb370fdc2e1d717865fb5580677affa7cdf3f53e76fa7ef14a75f7"} Apr 17 14:36:21.145192 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:21.145169 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-bss6z_8cf1a0e0-95fc-4bce-84ed-e3222e5beeb2/console-operator/0.log" Apr 17 14:36:21.145314 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:21.145213 2582 generic.go:358] "Generic (PLEG): container finished" podID="8cf1a0e0-95fc-4bce-84ed-e3222e5beeb2" 
containerID="f309c7e07747a81fd97f20d0b98049d19e450b48e07fb792e5ba2ff6f4f01a6a" exitCode=255 Apr 17 14:36:21.145314 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:21.145249 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-bss6z" event={"ID":"8cf1a0e0-95fc-4bce-84ed-e3222e5beeb2","Type":"ContainerDied","Data":"f309c7e07747a81fd97f20d0b98049d19e450b48e07fb792e5ba2ff6f4f01a6a"} Apr 17 14:36:21.145493 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:21.145479 2582 scope.go:117] "RemoveContainer" containerID="f309c7e07747a81fd97f20d0b98049d19e450b48e07fb792e5ba2ff6f4f01a6a" Apr 17 14:36:21.155075 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:21.155019 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-rl7lx" podStartSLOduration=3.155003928 podStartE2EDuration="3.155003928s" podCreationTimestamp="2026-04-17 14:36:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 14:36:21.154985889 +0000 UTC m=+134.022371540" watchObservedRunningTime="2026-04-17 14:36:21.155003928 +0000 UTC m=+134.022389579" Apr 17 14:36:21.209024 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:21.208956 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-9kpqq" podStartSLOduration=2.015531566 podStartE2EDuration="5.208938913s" podCreationTimestamp="2026-04-17 14:36:16 +0000 UTC" firstStartedPulling="2026-04-17 14:36:16.963070382 +0000 UTC m=+129.830456010" lastFinishedPulling="2026-04-17 14:36:20.156477719 +0000 UTC m=+133.023863357" observedRunningTime="2026-04-17 14:36:21.187909695 +0000 UTC m=+134.055295347" watchObservedRunningTime="2026-04-17 14:36:21.208938913 +0000 UTC m=+134.076324592" Apr 17 14:36:21.210339 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:21.210295 2582 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qq9b8" podStartSLOduration=2.007265884 podStartE2EDuration="5.21028146s" podCreationTimestamp="2026-04-17 14:36:16 +0000 UTC" firstStartedPulling="2026-04-17 14:36:16.946678128 +0000 UTC m=+129.814063758" lastFinishedPulling="2026-04-17 14:36:20.149693702 +0000 UTC m=+133.017079334" observedRunningTime="2026-04-17 14:36:21.208442887 +0000 UTC m=+134.075828538" watchObservedRunningTime="2026-04-17 14:36:21.21028146 +0000 UTC m=+134.077667111" Apr 17 14:36:21.428329 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:21.428251 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-kfqfw"] Apr 17 14:36:21.431141 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:21.431124 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-kfqfw" Apr 17 14:36:21.433834 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:21.433792 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 17 14:36:21.433971 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:21.433852 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-nkgj4\"" Apr 17 14:36:21.433971 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:21.433878 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 17 14:36:21.439166 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:21.439145 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-kfqfw"] Apr 17 
14:36:21.591978 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:21.591931 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mh44d\" (UniqueName: \"kubernetes.io/projected/c3698900-25ed-4e05-b420-0a5a402ecaac-kube-api-access-mh44d\") pod \"migrator-74bb7799d9-kfqfw\" (UID: \"c3698900-25ed-4e05-b420-0a5a402ecaac\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-kfqfw" Apr 17 14:36:21.693181 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:21.693091 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mh44d\" (UniqueName: \"kubernetes.io/projected/c3698900-25ed-4e05-b420-0a5a402ecaac-kube-api-access-mh44d\") pod \"migrator-74bb7799d9-kfqfw\" (UID: \"c3698900-25ed-4e05-b420-0a5a402ecaac\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-kfqfw" Apr 17 14:36:21.701000 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:21.700972 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mh44d\" (UniqueName: \"kubernetes.io/projected/c3698900-25ed-4e05-b420-0a5a402ecaac-kube-api-access-mh44d\") pod \"migrator-74bb7799d9-kfqfw\" (UID: \"c3698900-25ed-4e05-b420-0a5a402ecaac\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-kfqfw" Apr 17 14:36:21.740750 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:21.740720 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-kfqfw" Apr 17 14:36:21.869387 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:21.869352 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-kfqfw"] Apr 17 14:36:21.874104 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:36:21.874072 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3698900_25ed_4e05_b420_0a5a402ecaac.slice/crio-bc0da73fbd3379a961111fbde69d4a25ac626bafe1813a0cbb3bef58c46b2b96 WatchSource:0}: Error finding container bc0da73fbd3379a961111fbde69d4a25ac626bafe1813a0cbb3bef58c46b2b96: Status 404 returned error can't find the container with id bc0da73fbd3379a961111fbde69d4a25ac626bafe1813a0cbb3bef58c46b2b96 Apr 17 14:36:22.149292 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:22.149262 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-bss6z_8cf1a0e0-95fc-4bce-84ed-e3222e5beeb2/console-operator/1.log" Apr 17 14:36:22.151429 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:22.151407 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-bss6z_8cf1a0e0-95fc-4bce-84ed-e3222e5beeb2/console-operator/0.log" Apr 17 14:36:22.151549 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:22.151445 2582 generic.go:358] "Generic (PLEG): container finished" podID="8cf1a0e0-95fc-4bce-84ed-e3222e5beeb2" containerID="acdb42d2de8d06690ab4dc529874ed26f0099fe16fdab146bd8df88b0f8beef4" exitCode=255 Apr 17 14:36:22.151549 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:22.151478 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-bss6z" 
event={"ID":"8cf1a0e0-95fc-4bce-84ed-e3222e5beeb2","Type":"ContainerDied","Data":"acdb42d2de8d06690ab4dc529874ed26f0099fe16fdab146bd8df88b0f8beef4"} Apr 17 14:36:22.151549 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:22.151533 2582 scope.go:117] "RemoveContainer" containerID="f309c7e07747a81fd97f20d0b98049d19e450b48e07fb792e5ba2ff6f4f01a6a" Apr 17 14:36:22.151847 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:22.151825 2582 scope.go:117] "RemoveContainer" containerID="acdb42d2de8d06690ab4dc529874ed26f0099fe16fdab146bd8df88b0f8beef4" Apr 17 14:36:22.152102 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:36:22.152074 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-bss6z_openshift-console-operator(8cf1a0e0-95fc-4bce-84ed-e3222e5beeb2)\"" pod="openshift-console-operator/console-operator-9d4b6777b-bss6z" podUID="8cf1a0e0-95fc-4bce-84ed-e3222e5beeb2" Apr 17 14:36:22.152840 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:22.152784 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-kfqfw" event={"ID":"c3698900-25ed-4e05-b420-0a5a402ecaac","Type":"ContainerStarted","Data":"bc0da73fbd3379a961111fbde69d4a25ac626bafe1813a0cbb3bef58c46b2b96"} Apr 17 14:36:23.017372 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:23.017342 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-m5qlh_542f49ba-8bb4-4178-9c98-a94bc1f60de1/dns-node-resolver/0.log" Apr 17 14:36:23.157525 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:23.157500 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-bss6z_8cf1a0e0-95fc-4bce-84ed-e3222e5beeb2/console-operator/1.log" Apr 17 14:36:23.157921 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:23.157903 2582 
scope.go:117] "RemoveContainer" containerID="acdb42d2de8d06690ab4dc529874ed26f0099fe16fdab146bd8df88b0f8beef4" Apr 17 14:36:23.158139 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:36:23.158106 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-bss6z_openshift-console-operator(8cf1a0e0-95fc-4bce-84ed-e3222e5beeb2)\"" pod="openshift-console-operator/console-operator-9d4b6777b-bss6z" podUID="8cf1a0e0-95fc-4bce-84ed-e3222e5beeb2" Apr 17 14:36:23.159116 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:23.159095 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-kfqfw" event={"ID":"c3698900-25ed-4e05-b420-0a5a402ecaac","Type":"ContainerStarted","Data":"4ff67d767570d4594e8f8301c73d836e49b04cfadd16e4d7772083417c8a5f15"} Apr 17 14:36:23.159178 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:23.159123 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-kfqfw" event={"ID":"c3698900-25ed-4e05-b420-0a5a402ecaac","Type":"ContainerStarted","Data":"7d042ece3f685c7eec4c518523aaa9384fad6eef60f18a5a27e7be21390d66d5"} Apr 17 14:36:23.184926 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:23.184874 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-kfqfw" podStartSLOduration=1.031615514 podStartE2EDuration="2.18486053s" podCreationTimestamp="2026-04-17 14:36:21 +0000 UTC" firstStartedPulling="2026-04-17 14:36:21.876418267 +0000 UTC m=+134.743803896" lastFinishedPulling="2026-04-17 14:36:23.029663279 +0000 UTC m=+135.897048912" observedRunningTime="2026-04-17 14:36:23.184569286 +0000 UTC m=+136.051954938" watchObservedRunningTime="2026-04-17 14:36:23.18486053 +0000 UTC m=+136.052246172" Apr 
17 14:36:23.455079 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:23.454999 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-wdfn7"] Apr 17 14:36:23.458006 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:23.457989 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-wdfn7" Apr 17 14:36:23.460537 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:23.460512 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 17 14:36:23.460537 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:23.460522 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-qd6zh\"" Apr 17 14:36:23.460746 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:23.460578 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 17 14:36:23.460746 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:23.460592 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 17 14:36:23.460746 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:23.460578 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 17 14:36:23.464183 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:23.464159 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-wdfn7"] Apr 17 14:36:23.611020 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:23.610980 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/253c37af-663d-4fa7-aa2f-a8e578e9af55-signing-key\") pod \"service-ca-865cb79987-wdfn7\" (UID: 
\"253c37af-663d-4fa7-aa2f-a8e578e9af55\") " pod="openshift-service-ca/service-ca-865cb79987-wdfn7" Apr 17 14:36:23.611151 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:23.611031 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/253c37af-663d-4fa7-aa2f-a8e578e9af55-signing-cabundle\") pod \"service-ca-865cb79987-wdfn7\" (UID: \"253c37af-663d-4fa7-aa2f-a8e578e9af55\") " pod="openshift-service-ca/service-ca-865cb79987-wdfn7" Apr 17 14:36:23.611218 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:23.611163 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdgmk\" (UniqueName: \"kubernetes.io/projected/253c37af-663d-4fa7-aa2f-a8e578e9af55-kube-api-access-gdgmk\") pod \"service-ca-865cb79987-wdfn7\" (UID: \"253c37af-663d-4fa7-aa2f-a8e578e9af55\") " pod="openshift-service-ca/service-ca-865cb79987-wdfn7" Apr 17 14:36:23.712678 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:23.712585 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/253c37af-663d-4fa7-aa2f-a8e578e9af55-signing-key\") pod \"service-ca-865cb79987-wdfn7\" (UID: \"253c37af-663d-4fa7-aa2f-a8e578e9af55\") " pod="openshift-service-ca/service-ca-865cb79987-wdfn7" Apr 17 14:36:23.712678 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:23.712623 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/253c37af-663d-4fa7-aa2f-a8e578e9af55-signing-cabundle\") pod \"service-ca-865cb79987-wdfn7\" (UID: \"253c37af-663d-4fa7-aa2f-a8e578e9af55\") " pod="openshift-service-ca/service-ca-865cb79987-wdfn7" Apr 17 14:36:23.712886 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:23.712834 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-gdgmk\" (UniqueName: \"kubernetes.io/projected/253c37af-663d-4fa7-aa2f-a8e578e9af55-kube-api-access-gdgmk\") pod \"service-ca-865cb79987-wdfn7\" (UID: \"253c37af-663d-4fa7-aa2f-a8e578e9af55\") " pod="openshift-service-ca/service-ca-865cb79987-wdfn7" Apr 17 14:36:23.713271 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:23.713250 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/253c37af-663d-4fa7-aa2f-a8e578e9af55-signing-cabundle\") pod \"service-ca-865cb79987-wdfn7\" (UID: \"253c37af-663d-4fa7-aa2f-a8e578e9af55\") " pod="openshift-service-ca/service-ca-865cb79987-wdfn7" Apr 17 14:36:23.715196 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:23.715174 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/253c37af-663d-4fa7-aa2f-a8e578e9af55-signing-key\") pod \"service-ca-865cb79987-wdfn7\" (UID: \"253c37af-663d-4fa7-aa2f-a8e578e9af55\") " pod="openshift-service-ca/service-ca-865cb79987-wdfn7" Apr 17 14:36:23.720502 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:23.720480 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdgmk\" (UniqueName: \"kubernetes.io/projected/253c37af-663d-4fa7-aa2f-a8e578e9af55-kube-api-access-gdgmk\") pod \"service-ca-865cb79987-wdfn7\" (UID: \"253c37af-663d-4fa7-aa2f-a8e578e9af55\") " pod="openshift-service-ca/service-ca-865cb79987-wdfn7" Apr 17 14:36:23.767737 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:23.767704 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-wdfn7"
Apr 17 14:36:23.882229 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:23.882203 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-wdfn7"]
Apr 17 14:36:23.884696 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:36:23.884670 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod253c37af_663d_4fa7_aa2f_a8e578e9af55.slice/crio-fd6877d33adb717f86af2abe676435e0e7e03a31c80a4d73aff62a591fe58a05 WatchSource:0}: Error finding container fd6877d33adb717f86af2abe676435e0e7e03a31c80a4d73aff62a591fe58a05: Status 404 returned error can't find the container with id fd6877d33adb717f86af2abe676435e0e7e03a31c80a4d73aff62a591fe58a05
Apr 17 14:36:24.116061 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:24.116008 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/32387133-ec4d-425b-902f-441e1eccd234-registry-tls\") pod \"image-registry-7b97fb5868-crdxs\" (UID: \"32387133-ec4d-425b-902f-441e1eccd234\") " pod="openshift-image-registry/image-registry-7b97fb5868-crdxs"
Apr 17 14:36:24.116428 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:36:24.116176 2582 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 14:36:24.116428 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:36:24.116198 2582 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7b97fb5868-crdxs: secret "image-registry-tls" not found
Apr 17 14:36:24.116428 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:36:24.116268 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/32387133-ec4d-425b-902f-441e1eccd234-registry-tls podName:32387133-ec4d-425b-902f-441e1eccd234 nodeName:}" failed. No retries permitted until 2026-04-17 14:36:32.116251986 +0000 UTC m=+144.983637616 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/32387133-ec4d-425b-902f-441e1eccd234-registry-tls") pod "image-registry-7b97fb5868-crdxs" (UID: "32387133-ec4d-425b-902f-441e1eccd234") : secret "image-registry-tls" not found
Apr 17 14:36:24.162896 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:24.162859 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-wdfn7" event={"ID":"253c37af-663d-4fa7-aa2f-a8e578e9af55","Type":"ContainerStarted","Data":"fd6877d33adb717f86af2abe676435e0e7e03a31c80a4d73aff62a591fe58a05"}
Apr 17 14:36:24.216592 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:24.216549 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/1ccbfbc4-5994-4305-ad57-c1b4c21d6b7e-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-4mfpc\" (UID: \"1ccbfbc4-5994-4305-ad57-c1b4c21d6b7e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-4mfpc"
Apr 17 14:36:24.216592 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:24.216594 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/bb5ddf65-02bb-4168-a09c-03d98c828a9f-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-jbzj4\" (UID: \"bb5ddf65-02bb-4168-a09c-03d98c828a9f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-jbzj4"
Apr 17 14:36:24.216798 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:36:24.216689 2582 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 17 14:36:24.216798 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:36:24.216696 2582 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 17 14:36:24.216798 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:36:24.216742 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bb5ddf65-02bb-4168-a09c-03d98c828a9f-samples-operator-tls podName:bb5ddf65-02bb-4168-a09c-03d98c828a9f nodeName:}" failed. No retries permitted until 2026-04-17 14:36:32.216730503 +0000 UTC m=+145.084116131 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/bb5ddf65-02bb-4168-a09c-03d98c828a9f-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-jbzj4" (UID: "bb5ddf65-02bb-4168-a09c-03d98c828a9f") : secret "samples-operator-tls" not found
Apr 17 14:36:24.216798 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:36:24.216755 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ccbfbc4-5994-4305-ad57-c1b4c21d6b7e-cluster-monitoring-operator-tls podName:1ccbfbc4-5994-4305-ad57-c1b4c21d6b7e nodeName:}" failed. No retries permitted until 2026-04-17 14:36:32.216749025 +0000 UTC m=+145.084134655 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/1ccbfbc4-5994-4305-ad57-c1b4c21d6b7e-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-4mfpc" (UID: "1ccbfbc4-5994-4305-ad57-c1b4c21d6b7e") : secret "cluster-monitoring-operator-tls" not found
Apr 17 14:36:24.217185 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:24.217169 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-rhww7_82ff5ce9-528b-4d19-9a09-ac7e64ef9d46/node-ca/0.log"
Apr 17 14:36:26.169452 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:26.169365 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-wdfn7" event={"ID":"253c37af-663d-4fa7-aa2f-a8e578e9af55","Type":"ContainerStarted","Data":"1d98ef1c02e5559ef65e4ca10555a87a2c905307f6a8cd8a2fe900792daf6e22"}
Apr 17 14:36:26.185860 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:26.185768 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-wdfn7" podStartSLOduration=1.255068965 podStartE2EDuration="3.185754042s" podCreationTimestamp="2026-04-17 14:36:23 +0000 UTC" firstStartedPulling="2026-04-17 14:36:23.886417075 +0000 UTC m=+136.753802704" lastFinishedPulling="2026-04-17 14:36:25.817102152 +0000 UTC m=+138.684487781" observedRunningTime="2026-04-17 14:36:26.18439504 +0000 UTC m=+139.051780702" watchObservedRunningTime="2026-04-17 14:36:26.185754042 +0000 UTC m=+139.053139694"
Apr 17 14:36:26.618357 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:26.618315 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-bss6z"
Apr 17 14:36:26.618357 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:26.618361 2582 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-bss6z"
Apr 17 14:36:26.618718 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:26.618704 2582 scope.go:117] "RemoveContainer" containerID="acdb42d2de8d06690ab4dc529874ed26f0099fe16fdab146bd8df88b0f8beef4"
Apr 17 14:36:26.618926 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:36:26.618908 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-bss6z_openshift-console-operator(8cf1a0e0-95fc-4bce-84ed-e3222e5beeb2)\"" pod="openshift-console-operator/console-operator-9d4b6777b-bss6z" podUID="8cf1a0e0-95fc-4bce-84ed-e3222e5beeb2"
Apr 17 14:36:32.188998 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:32.188961 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/32387133-ec4d-425b-902f-441e1eccd234-registry-tls\") pod \"image-registry-7b97fb5868-crdxs\" (UID: \"32387133-ec4d-425b-902f-441e1eccd234\") " pod="openshift-image-registry/image-registry-7b97fb5868-crdxs"
Apr 17 14:36:32.191467 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:32.191442 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/32387133-ec4d-425b-902f-441e1eccd234-registry-tls\") pod \"image-registry-7b97fb5868-crdxs\" (UID: \"32387133-ec4d-425b-902f-441e1eccd234\") " pod="openshift-image-registry/image-registry-7b97fb5868-crdxs"
Apr 17 14:36:32.223451 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:32.223399 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7b97fb5868-crdxs"
Apr 17 14:36:32.290283 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:32.290254 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/1ccbfbc4-5994-4305-ad57-c1b4c21d6b7e-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-4mfpc\" (UID: \"1ccbfbc4-5994-4305-ad57-c1b4c21d6b7e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-4mfpc"
Apr 17 14:36:32.290485 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:32.290469 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/bb5ddf65-02bb-4168-a09c-03d98c828a9f-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-jbzj4\" (UID: \"bb5ddf65-02bb-4168-a09c-03d98c828a9f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-jbzj4"
Apr 17 14:36:32.290737 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:36:32.290413 2582 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 17 14:36:32.290737 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:36:32.290710 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ccbfbc4-5994-4305-ad57-c1b4c21d6b7e-cluster-monitoring-operator-tls podName:1ccbfbc4-5994-4305-ad57-c1b4c21d6b7e nodeName:}" failed. No retries permitted until 2026-04-17 14:36:48.290687799 +0000 UTC m=+161.158073431 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/1ccbfbc4-5994-4305-ad57-c1b4c21d6b7e-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-4mfpc" (UID: "1ccbfbc4-5994-4305-ad57-c1b4c21d6b7e") : secret "cluster-monitoring-operator-tls" not found
Apr 17 14:36:32.293210 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:32.293185 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/bb5ddf65-02bb-4168-a09c-03d98c828a9f-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-jbzj4\" (UID: \"bb5ddf65-02bb-4168-a09c-03d98c828a9f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-jbzj4"
Apr 17 14:36:32.308412 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:32.308382 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-jbzj4"
Apr 17 14:36:32.345733 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:32.345700 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7b97fb5868-crdxs"]
Apr 17 14:36:32.349988 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:36:32.349963 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod32387133_ec4d_425b_902f_441e1eccd234.slice/crio-f43e468af173a742b828bd57f0c69123a8bb11a3c5afcdf0e89e0b8655b06e50 WatchSource:0}: Error finding container f43e468af173a742b828bd57f0c69123a8bb11a3c5afcdf0e89e0b8655b06e50: Status 404 returned error can't find the container with id f43e468af173a742b828bd57f0c69123a8bb11a3c5afcdf0e89e0b8655b06e50
Apr 17 14:36:32.430493 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:32.430463 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-jbzj4"]
Apr 17 14:36:33.187793 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:33.187761 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-jbzj4" event={"ID":"bb5ddf65-02bb-4168-a09c-03d98c828a9f","Type":"ContainerStarted","Data":"250cd605c2af40677d3c027efc66dbb5a08712f391a05a6e904c8aa8397d59ef"}
Apr 17 14:36:33.189395 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:33.189364 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7b97fb5868-crdxs" event={"ID":"32387133-ec4d-425b-902f-441e1eccd234","Type":"ContainerStarted","Data":"ba0c11266cf95728e72f5014da033e9037d33d2b085ad471358e0e297a6a92b9"}
Apr 17 14:36:33.189395 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:33.189397 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7b97fb5868-crdxs" event={"ID":"32387133-ec4d-425b-902f-441e1eccd234","Type":"ContainerStarted","Data":"f43e468af173a742b828bd57f0c69123a8bb11a3c5afcdf0e89e0b8655b06e50"}
Apr 17 14:36:33.189850 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:33.189528 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-7b97fb5868-crdxs"
Apr 17 14:36:33.211225 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:33.211170 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-7b97fb5868-crdxs" podStartSLOduration=17.211152306 podStartE2EDuration="17.211152306s" podCreationTimestamp="2026-04-17 14:36:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 14:36:33.20964507 +0000 UTC m=+146.077030722" watchObservedRunningTime="2026-04-17 14:36:33.211152306 +0000 UTC m=+146.078537956"
Apr 17 14:36:35.196113 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:35.196075 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-jbzj4" event={"ID":"bb5ddf65-02bb-4168-a09c-03d98c828a9f","Type":"ContainerStarted","Data":"124841a17d3df045e52acf9e431ea8f6d8358f533221f85fb91f389da48fb665"}
Apr 17 14:36:35.196543 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:35.196120 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-jbzj4" event={"ID":"bb5ddf65-02bb-4168-a09c-03d98c828a9f","Type":"ContainerStarted","Data":"d1206015cf68e6e027822a5d11da578683395e3218e7e1430328d13ed9fe2144"}
Apr 17 14:36:35.211531 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:35.211480 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-jbzj4" podStartSLOduration=17.38996058 podStartE2EDuration="19.211466615s" podCreationTimestamp="2026-04-17 14:36:16 +0000 UTC" firstStartedPulling="2026-04-17 14:36:32.467591494 +0000 UTC m=+145.334977124" lastFinishedPulling="2026-04-17 14:36:34.289097515 +0000 UTC m=+147.156483159" observedRunningTime="2026-04-17 14:36:35.210342678 +0000 UTC m=+148.077728330" watchObservedRunningTime="2026-04-17 14:36:35.211466615 +0000 UTC m=+148.078852265"
Apr 17 14:36:39.739151 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:39.739114 2582 scope.go:117] "RemoveContainer" containerID="acdb42d2de8d06690ab4dc529874ed26f0099fe16fdab146bd8df88b0f8beef4"
Apr 17 14:36:40.209441 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:40.209410 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-bss6z_8cf1a0e0-95fc-4bce-84ed-e3222e5beeb2/console-operator/2.log"
Apr 17 14:36:40.209780 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:40.209764 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-bss6z_8cf1a0e0-95fc-4bce-84ed-e3222e5beeb2/console-operator/1.log"
Apr 17 14:36:40.209885 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:40.209818 2582 generic.go:358] "Generic (PLEG): container finished" podID="8cf1a0e0-95fc-4bce-84ed-e3222e5beeb2" containerID="7a3939483d16566b5a94d8d09608ae7ae530efb345bc06f409479481a8e52166" exitCode=255
Apr 17 14:36:40.209885 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:40.209847 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-bss6z" event={"ID":"8cf1a0e0-95fc-4bce-84ed-e3222e5beeb2","Type":"ContainerDied","Data":"7a3939483d16566b5a94d8d09608ae7ae530efb345bc06f409479481a8e52166"}
Apr 17 14:36:40.209885 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:40.209881 2582 scope.go:117] "RemoveContainer" containerID="acdb42d2de8d06690ab4dc529874ed26f0099fe16fdab146bd8df88b0f8beef4"
Apr 17 14:36:40.210290 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:40.210267 2582 scope.go:117] "RemoveContainer" containerID="7a3939483d16566b5a94d8d09608ae7ae530efb345bc06f409479481a8e52166"
Apr 17 14:36:40.210495 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:36:40.210475 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-bss6z_openshift-console-operator(8cf1a0e0-95fc-4bce-84ed-e3222e5beeb2)\"" pod="openshift-console-operator/console-operator-9d4b6777b-bss6z" podUID="8cf1a0e0-95fc-4bce-84ed-e3222e5beeb2"
Apr 17 14:36:41.213184 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:41.213153 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-bss6z_8cf1a0e0-95fc-4bce-84ed-e3222e5beeb2/console-operator/2.log"
Apr 17 14:36:41.870776 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:41.870742 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-kqx2d"]
Apr 17 14:36:41.875144 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:41.875121 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-kqx2d"
Apr 17 14:36:41.877506 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:41.877483 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-qsgzg\""
Apr 17 14:36:41.877625 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:41.877566 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 17 14:36:41.877625 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:41.877566 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 17 14:36:41.885195 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:41.885173 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-kqx2d"]
Apr 17 14:36:41.908744 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:41.908715 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-7b97fb5868-crdxs"]
Apr 17 14:36:41.970169 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:41.970124 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/d0785d07-4d16-482b-a416-61f38c7665f3-data-volume\") pod \"insights-runtime-extractor-kqx2d\" (UID: \"d0785d07-4d16-482b-a416-61f38c7665f3\") " pod="openshift-insights/insights-runtime-extractor-kqx2d"
Apr 17 14:36:41.970338 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:41.970200 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4f2q\" (UniqueName: \"kubernetes.io/projected/d0785d07-4d16-482b-a416-61f38c7665f3-kube-api-access-d4f2q\") pod \"insights-runtime-extractor-kqx2d\" (UID: \"d0785d07-4d16-482b-a416-61f38c7665f3\") " pod="openshift-insights/insights-runtime-extractor-kqx2d"
Apr 17 14:36:41.970338 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:41.970265 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/d0785d07-4d16-482b-a416-61f38c7665f3-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-kqx2d\" (UID: \"d0785d07-4d16-482b-a416-61f38c7665f3\") " pod="openshift-insights/insights-runtime-extractor-kqx2d"
Apr 17 14:36:41.970338 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:41.970330 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/d0785d07-4d16-482b-a416-61f38c7665f3-crio-socket\") pod \"insights-runtime-extractor-kqx2d\" (UID: \"d0785d07-4d16-482b-a416-61f38c7665f3\") " pod="openshift-insights/insights-runtime-extractor-kqx2d"
Apr 17 14:36:41.970431 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:41.970375 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/d0785d07-4d16-482b-a416-61f38c7665f3-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-kqx2d\" (UID: \"d0785d07-4d16-482b-a416-61f38c7665f3\") " pod="openshift-insights/insights-runtime-extractor-kqx2d"
Apr 17 14:36:42.071213 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:42.071177 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d4f2q\" (UniqueName: \"kubernetes.io/projected/d0785d07-4d16-482b-a416-61f38c7665f3-kube-api-access-d4f2q\") pod \"insights-runtime-extractor-kqx2d\" (UID: \"d0785d07-4d16-482b-a416-61f38c7665f3\") " pod="openshift-insights/insights-runtime-extractor-kqx2d"
Apr 17 14:36:42.071213 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:42.071217 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/d0785d07-4d16-482b-a416-61f38c7665f3-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-kqx2d\" (UID: \"d0785d07-4d16-482b-a416-61f38c7665f3\") " pod="openshift-insights/insights-runtime-extractor-kqx2d"
Apr 17 14:36:42.071428 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:42.071400 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/d0785d07-4d16-482b-a416-61f38c7665f3-crio-socket\") pod \"insights-runtime-extractor-kqx2d\" (UID: \"d0785d07-4d16-482b-a416-61f38c7665f3\") " pod="openshift-insights/insights-runtime-extractor-kqx2d"
Apr 17 14:36:42.071498 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:42.071480 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/d0785d07-4d16-482b-a416-61f38c7665f3-crio-socket\") pod \"insights-runtime-extractor-kqx2d\" (UID: \"d0785d07-4d16-482b-a416-61f38c7665f3\") " pod="openshift-insights/insights-runtime-extractor-kqx2d"
Apr 17 14:36:42.071532 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:42.071496 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/d0785d07-4d16-482b-a416-61f38c7665f3-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-kqx2d\" (UID: \"d0785d07-4d16-482b-a416-61f38c7665f3\") " pod="openshift-insights/insights-runtime-extractor-kqx2d"
Apr 17 14:36:42.071569 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:42.071544 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/d0785d07-4d16-482b-a416-61f38c7665f3-data-volume\") pod \"insights-runtime-extractor-kqx2d\" (UID: \"d0785d07-4d16-482b-a416-61f38c7665f3\") " pod="openshift-insights/insights-runtime-extractor-kqx2d"
Apr 17 14:36:42.071964 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:42.071943 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/d0785d07-4d16-482b-a416-61f38c7665f3-data-volume\") pod \"insights-runtime-extractor-kqx2d\" (UID: \"d0785d07-4d16-482b-a416-61f38c7665f3\") " pod="openshift-insights/insights-runtime-extractor-kqx2d"
Apr 17 14:36:42.072081 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:42.072065 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/d0785d07-4d16-482b-a416-61f38c7665f3-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-kqx2d\" (UID: \"d0785d07-4d16-482b-a416-61f38c7665f3\") " pod="openshift-insights/insights-runtime-extractor-kqx2d"
Apr 17 14:36:42.073673 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:42.073657 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/d0785d07-4d16-482b-a416-61f38c7665f3-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-kqx2d\" (UID: \"d0785d07-4d16-482b-a416-61f38c7665f3\") " pod="openshift-insights/insights-runtime-extractor-kqx2d"
Apr 17 14:36:42.080640 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:42.080619 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4f2q\" (UniqueName: \"kubernetes.io/projected/d0785d07-4d16-482b-a416-61f38c7665f3-kube-api-access-d4f2q\") pod \"insights-runtime-extractor-kqx2d\" (UID: \"d0785d07-4d16-482b-a416-61f38c7665f3\") " pod="openshift-insights/insights-runtime-extractor-kqx2d"
Apr 17 14:36:42.185016 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:42.184926 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-kqx2d"
Apr 17 14:36:42.306612 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:42.306579 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-kqx2d"]
Apr 17 14:36:42.310122 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:36:42.310095 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd0785d07_4d16_482b_a416_61f38c7665f3.slice/crio-61b7cee71026fbdb23c47cc7ef0071d4059c6324e9aab5d7d3ecde1b1a65e7c9 WatchSource:0}: Error finding container 61b7cee71026fbdb23c47cc7ef0071d4059c6324e9aab5d7d3ecde1b1a65e7c9: Status 404 returned error can't find the container with id 61b7cee71026fbdb23c47cc7ef0071d4059c6324e9aab5d7d3ecde1b1a65e7c9
Apr 17 14:36:43.025405 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:36:43.025363 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-kj9w5" podUID="fe006aa3-2754-4ceb-acc9-c8189d25053b"
Apr 17 14:36:43.034667 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:36:43.034628 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-87v7h" podUID="76590649-d620-489b-9a3c-5c78ec32d35e"
Apr 17 14:36:43.220957 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:43.220925 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-kqx2d" event={"ID":"d0785d07-4d16-482b-a416-61f38c7665f3","Type":"ContainerStarted","Data":"d3c51609517ba55b1e57b78e622a4e70624a745e14b989ec7d3c34ea3427b95d"}
Apr 17 14:36:43.220957 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:43.220955 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-87v7h"
Apr 17 14:36:43.221119 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:43.220966 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-kqx2d" event={"ID":"d0785d07-4d16-482b-a416-61f38c7665f3","Type":"ContainerStarted","Data":"5cea5ae45a171e4a320bd4b9548023d6e735cf6af68b4ed8f5659e7c29ba72c9"}
Apr 17 14:36:43.221119 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:43.220977 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-kqx2d" event={"ID":"d0785d07-4d16-482b-a416-61f38c7665f3","Type":"ContainerStarted","Data":"61b7cee71026fbdb23c47cc7ef0071d4059c6324e9aab5d7d3ecde1b1a65e7c9"}
Apr 17 14:36:43.221119 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:43.220937 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-kj9w5"
Apr 17 14:36:44.756212 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:36:44.756163 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-4kdjq" podUID="fbcf40f6-2ec0-4fb3-85d8-30ecb284384d"
Apr 17 14:36:45.227081 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:45.226991 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-kqx2d" event={"ID":"d0785d07-4d16-482b-a416-61f38c7665f3","Type":"ContainerStarted","Data":"241f645a4aa6763f774edabe309d7c129a3ce23a79c85b04c7863abf6572a5ea"}
Apr 17 14:36:45.244294 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:45.244239 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-kqx2d" podStartSLOduration=1.822284318 podStartE2EDuration="4.244224836s" podCreationTimestamp="2026-04-17 14:36:41 +0000 UTC" firstStartedPulling="2026-04-17 14:36:42.375306017 +0000 UTC m=+155.242691649" lastFinishedPulling="2026-04-17 14:36:44.797246534 +0000 UTC m=+157.664632167" observedRunningTime="2026-04-17 14:36:45.243874583 +0000 UTC m=+158.111260234" watchObservedRunningTime="2026-04-17 14:36:45.244224836 +0000 UTC m=+158.111610511"
Apr 17 14:36:46.619207 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:46.619170 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-bss6z"
Apr 17 14:36:46.619598 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:46.619219 2582 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-bss6z"
Apr 17 14:36:46.619598 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:46.619559 2582 scope.go:117] "RemoveContainer" containerID="7a3939483d16566b5a94d8d09608ae7ae530efb345bc06f409479481a8e52166"
Apr 17 14:36:46.619741 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:36:46.619722 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-bss6z_openshift-console-operator(8cf1a0e0-95fc-4bce-84ed-e3222e5beeb2)\"" pod="openshift-console-operator/console-operator-9d4b6777b-bss6z" podUID="8cf1a0e0-95fc-4bce-84ed-e3222e5beeb2"
Apr 17 14:36:48.023583 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:48.023544 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fe006aa3-2754-4ceb-acc9-c8189d25053b-metrics-tls\") pod \"dns-default-kj9w5\" (UID: \"fe006aa3-2754-4ceb-acc9-c8189d25053b\") " pod="openshift-dns/dns-default-kj9w5"
Apr 17 14:36:48.024045 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:48.023626 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/76590649-d620-489b-9a3c-5c78ec32d35e-cert\") pod \"ingress-canary-87v7h\" (UID: \"76590649-d620-489b-9a3c-5c78ec32d35e\") " pod="openshift-ingress-canary/ingress-canary-87v7h"
Apr 17 14:36:48.026170 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:48.026148 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fe006aa3-2754-4ceb-acc9-c8189d25053b-metrics-tls\") pod \"dns-default-kj9w5\" (UID: \"fe006aa3-2754-4ceb-acc9-c8189d25053b\") " pod="openshift-dns/dns-default-kj9w5"
Apr 17 14:36:48.026280 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:48.026263 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/76590649-d620-489b-9a3c-5c78ec32d35e-cert\") pod \"ingress-canary-87v7h\" (UID: \"76590649-d620-489b-9a3c-5c78ec32d35e\") " pod="openshift-ingress-canary/ingress-canary-87v7h"
Apr 17 14:36:48.324577 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:48.324492 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-dtcwn\""
Apr 17 14:36:48.325324 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:48.325302 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-fr4dp\""
Apr 17 14:36:48.326462 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:48.326442 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/1ccbfbc4-5994-4305-ad57-c1b4c21d6b7e-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-4mfpc\" (UID: \"1ccbfbc4-5994-4305-ad57-c1b4c21d6b7e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-4mfpc"
Apr 17 14:36:48.329024 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:48.329004 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/1ccbfbc4-5994-4305-ad57-c1b4c21d6b7e-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-4mfpc\" (UID: \"1ccbfbc4-5994-4305-ad57-c1b4c21d6b7e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-4mfpc"
Apr 17 14:36:48.332817 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:48.332777 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-87v7h"
Apr 17 14:36:48.332902 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:48.332880 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-kj9w5"
Apr 17 14:36:48.463141 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:48.463102 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-kj9w5"]
Apr 17 14:36:48.466467 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:36:48.466440 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe006aa3_2754_4ceb_acc9_c8189d25053b.slice/crio-ebb07adb893df6cf1c8ffcf2b8c255b53b4d7f0b9d1731c4ba48f64c9f2ca232 WatchSource:0}: Error finding container ebb07adb893df6cf1c8ffcf2b8c255b53b4d7f0b9d1731c4ba48f64c9f2ca232: Status 404 returned error can't find the container with id ebb07adb893df6cf1c8ffcf2b8c255b53b4d7f0b9d1731c4ba48f64c9f2ca232
Apr 17 14:36:48.488865 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:48.488830 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-87v7h"]
Apr 17 14:36:48.492509 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:36:48.492482 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76590649_d620_489b_9a3c_5c78ec32d35e.slice/crio-11216d1b85e1d7de8e1ae78f59c155cc8dad203daa602621d89ade3ba9e07e7e WatchSource:0}: Error finding container 11216d1b85e1d7de8e1ae78f59c155cc8dad203daa602621d89ade3ba9e07e7e: Status 404 returned error can't find the container with id 11216d1b85e1d7de8e1ae78f59c155cc8dad203daa602621d89ade3ba9e07e7e
Apr 17 14:36:48.517223 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:48.517189 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-4mfpc"
Apr 17 14:36:48.634678 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:48.634647 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-4mfpc"]
Apr 17 14:36:48.637858 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:36:48.637832 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ccbfbc4_5994_4305_ad57_c1b4c21d6b7e.slice/crio-9708f367653772e851e64615c4175bfac8334acf7c70932f792f0d0cb6b07790 WatchSource:0}: Error finding container 9708f367653772e851e64615c4175bfac8334acf7c70932f792f0d0cb6b07790: Status 404 returned error can't find the container with id 9708f367653772e851e64615c4175bfac8334acf7c70932f792f0d0cb6b07790
Apr 17 14:36:49.247588 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:49.247535 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-87v7h" event={"ID":"76590649-d620-489b-9a3c-5c78ec32d35e","Type":"ContainerStarted","Data":"11216d1b85e1d7de8e1ae78f59c155cc8dad203daa602621d89ade3ba9e07e7e"}
Apr 17 14:36:49.248735 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:49.248694 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-kj9w5" event={"ID":"fe006aa3-2754-4ceb-acc9-c8189d25053b","Type":"ContainerStarted","Data":"ebb07adb893df6cf1c8ffcf2b8c255b53b4d7f0b9d1731c4ba48f64c9f2ca232"}
Apr 17 14:36:49.249845 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:49.249816 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-4mfpc" event={"ID":"1ccbfbc4-5994-4305-ad57-c1b4c21d6b7e","Type":"ContainerStarted","Data":"9708f367653772e851e64615c4175bfac8334acf7c70932f792f0d0cb6b07790"}
Apr 17 14:36:51.914091 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:51.914052 2582 patch_prober.go:28] interesting pod/image-registry-7b97fb5868-crdxs container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 17 14:36:51.914556 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:51.914112 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-7b97fb5868-crdxs" podUID="32387133-ec4d-425b-902f-441e1eccd234" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 14:36:51.999963 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:51.999927 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-lsckd"]
Apr 17 14:36:52.003348 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:52.003327 2582 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-lsckd" Apr 17 14:36:52.005653 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:52.005624 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 17 14:36:52.005779 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:52.005677 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-h2zws\"" Apr 17 14:36:52.009789 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:52.009767 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-lsckd"] Apr 17 14:36:52.061224 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:52.061190 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/9d8aa6c7-c009-4909-982d-1c27652a9903-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-lsckd\" (UID: \"9d8aa6c7-c009-4909-982d-1c27652a9903\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-lsckd" Apr 17 14:36:52.162424 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:52.162385 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/9d8aa6c7-c009-4909-982d-1c27652a9903-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-lsckd\" (UID: \"9d8aa6c7-c009-4909-982d-1c27652a9903\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-lsckd" Apr 17 14:36:52.162615 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:36:52.162529 2582 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret 
"prometheus-operator-admission-webhook-tls" not found Apr 17 14:36:52.162615 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:36:52.162597 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d8aa6c7-c009-4909-982d-1c27652a9903-tls-certificates podName:9d8aa6c7-c009-4909-982d-1c27652a9903 nodeName:}" failed. No retries permitted until 2026-04-17 14:36:52.662581419 +0000 UTC m=+165.529967053 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/9d8aa6c7-c009-4909-982d-1c27652a9903-tls-certificates") pod "prometheus-operator-admission-webhook-57cf98b594-lsckd" (UID: "9d8aa6c7-c009-4909-982d-1c27652a9903") : secret "prometheus-operator-admission-webhook-tls" not found Apr 17 14:36:52.259480 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:52.259377 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-4mfpc" event={"ID":"1ccbfbc4-5994-4305-ad57-c1b4c21d6b7e","Type":"ContainerStarted","Data":"88b3e747eae367c41ed9566491470bc5567c16d3726d69c0d1287a6c3bc330ec"} Apr 17 14:36:52.260952 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:52.260920 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-87v7h" event={"ID":"76590649-d620-489b-9a3c-5c78ec32d35e","Type":"ContainerStarted","Data":"df5ef88188049fea16cca58a8313b71a9e4f32da3cbcc5e3a433bb27401bdbdd"} Apr 17 14:36:52.262534 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:52.262501 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-kj9w5" event={"ID":"fe006aa3-2754-4ceb-acc9-c8189d25053b","Type":"ContainerStarted","Data":"23ad9a3b091e9a603601742fd202f37fc581a610df774dcf58340770208c8144"} Apr 17 14:36:52.262534 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:52.262534 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-kj9w5" 
event={"ID":"fe006aa3-2754-4ceb-acc9-c8189d25053b","Type":"ContainerStarted","Data":"6238a6883508a82d94ef584325c9c421cef08b914f842f85303e35c702a57c39"} Apr 17 14:36:52.262686 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:52.262648 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-kj9w5" Apr 17 14:36:52.273877 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:52.273629 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-4mfpc" podStartSLOduration=33.43615299 podStartE2EDuration="36.27361358s" podCreationTimestamp="2026-04-17 14:36:16 +0000 UTC" firstStartedPulling="2026-04-17 14:36:48.639644522 +0000 UTC m=+161.507030151" lastFinishedPulling="2026-04-17 14:36:51.477105109 +0000 UTC m=+164.344490741" observedRunningTime="2026-04-17 14:36:52.273545385 +0000 UTC m=+165.140931037" watchObservedRunningTime="2026-04-17 14:36:52.27361358 +0000 UTC m=+165.140999229" Apr 17 14:36:52.288776 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:52.288711 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-kj9w5" podStartSLOduration=130.28354185 podStartE2EDuration="2m13.288692354s" podCreationTimestamp="2026-04-17 14:34:39 +0000 UTC" firstStartedPulling="2026-04-17 14:36:48.468441776 +0000 UTC m=+161.335827404" lastFinishedPulling="2026-04-17 14:36:51.473592276 +0000 UTC m=+164.340977908" observedRunningTime="2026-04-17 14:36:52.288448881 +0000 UTC m=+165.155834532" watchObservedRunningTime="2026-04-17 14:36:52.288692354 +0000 UTC m=+165.156078006" Apr 17 14:36:52.301964 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:52.301918 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-87v7h" podStartSLOduration=130.321245838 podStartE2EDuration="2m13.301902293s" podCreationTimestamp="2026-04-17 14:34:39 +0000 UTC" 
firstStartedPulling="2026-04-17 14:36:48.49442757 +0000 UTC m=+161.361813201" lastFinishedPulling="2026-04-17 14:36:51.475084024 +0000 UTC m=+164.342469656" observedRunningTime="2026-04-17 14:36:52.300816675 +0000 UTC m=+165.168202321" watchObservedRunningTime="2026-04-17 14:36:52.301902293 +0000 UTC m=+165.169287953" Apr 17 14:36:52.667111 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:52.667013 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/9d8aa6c7-c009-4909-982d-1c27652a9903-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-lsckd\" (UID: \"9d8aa6c7-c009-4909-982d-1c27652a9903\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-lsckd" Apr 17 14:36:52.669618 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:52.669597 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/9d8aa6c7-c009-4909-982d-1c27652a9903-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-lsckd\" (UID: \"9d8aa6c7-c009-4909-982d-1c27652a9903\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-lsckd" Apr 17 14:36:52.913069 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:52.913029 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-lsckd" Apr 17 14:36:53.030103 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:53.030068 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-lsckd"] Apr 17 14:36:53.033196 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:36:53.033167 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d8aa6c7_c009_4909_982d_1c27652a9903.slice/crio-716a75ba702e55e788ac41a1f49f378b02908b83efcad706a4285619b4425db1 WatchSource:0}: Error finding container 716a75ba702e55e788ac41a1f49f378b02908b83efcad706a4285619b4425db1: Status 404 returned error can't find the container with id 716a75ba702e55e788ac41a1f49f378b02908b83efcad706a4285619b4425db1 Apr 17 14:36:53.266782 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:53.266744 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-lsckd" event={"ID":"9d8aa6c7-c009-4909-982d-1c27652a9903","Type":"ContainerStarted","Data":"716a75ba702e55e788ac41a1f49f378b02908b83efcad706a4285619b4425db1"} Apr 17 14:36:54.271062 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:54.270981 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-lsckd" event={"ID":"9d8aa6c7-c009-4909-982d-1c27652a9903","Type":"ContainerStarted","Data":"2cf074c75c06d3f6d11477504e356e29b34415cdc8a8a839dd2aee5bd1f82313"} Apr 17 14:36:54.271495 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:54.271204 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-lsckd" Apr 17 14:36:54.276821 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:54.276776 2582 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-lsckd" Apr 17 14:36:54.284771 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:54.284701 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-lsckd" podStartSLOduration=2.245309951 podStartE2EDuration="3.284683571s" podCreationTimestamp="2026-04-17 14:36:51 +0000 UTC" firstStartedPulling="2026-04-17 14:36:53.035566882 +0000 UTC m=+165.902952526" lastFinishedPulling="2026-04-17 14:36:54.074940516 +0000 UTC m=+166.942326146" observedRunningTime="2026-04-17 14:36:54.284365964 +0000 UTC m=+167.151751614" watchObservedRunningTime="2026-04-17 14:36:54.284683571 +0000 UTC m=+167.152069223" Apr 17 14:36:55.059225 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:55.059187 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-bzgd8"] Apr 17 14:36:55.062779 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:55.062756 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-bzgd8" Apr 17 14:36:55.066673 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:55.066652 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-2hdx8\"" Apr 17 14:36:55.067438 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:55.067418 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 17 14:36:55.067680 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:55.067657 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 17 14:36:55.067767 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:55.067734 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 17 14:36:55.085129 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:55.085103 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-bzgd8"] Apr 17 14:36:55.190482 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:55.190446 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/5881f6c9-b857-44e4-b059-69f64682df56-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-bzgd8\" (UID: \"5881f6c9-b857-44e4-b059-69f64682df56\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-bzgd8" Apr 17 14:36:55.190657 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:55.190492 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5881f6c9-b857-44e4-b059-69f64682df56-prometheus-operator-kube-rbac-proxy-config\") pod 
\"prometheus-operator-5676c8c784-bzgd8\" (UID: \"5881f6c9-b857-44e4-b059-69f64682df56\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-bzgd8" Apr 17 14:36:55.190657 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:55.190527 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5881f6c9-b857-44e4-b059-69f64682df56-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-bzgd8\" (UID: \"5881f6c9-b857-44e4-b059-69f64682df56\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-bzgd8" Apr 17 14:36:55.190657 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:55.190620 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-499k2\" (UniqueName: \"kubernetes.io/projected/5881f6c9-b857-44e4-b059-69f64682df56-kube-api-access-499k2\") pod \"prometheus-operator-5676c8c784-bzgd8\" (UID: \"5881f6c9-b857-44e4-b059-69f64682df56\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-bzgd8" Apr 17 14:36:55.291933 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:55.291898 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-499k2\" (UniqueName: \"kubernetes.io/projected/5881f6c9-b857-44e4-b059-69f64682df56-kube-api-access-499k2\") pod \"prometheus-operator-5676c8c784-bzgd8\" (UID: \"5881f6c9-b857-44e4-b059-69f64682df56\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-bzgd8" Apr 17 14:36:55.292288 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:55.291941 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/5881f6c9-b857-44e4-b059-69f64682df56-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-bzgd8\" (UID: \"5881f6c9-b857-44e4-b059-69f64682df56\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-bzgd8" Apr 17 
14:36:55.292288 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:55.291972 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5881f6c9-b857-44e4-b059-69f64682df56-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-bzgd8\" (UID: \"5881f6c9-b857-44e4-b059-69f64682df56\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-bzgd8" Apr 17 14:36:55.292288 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:55.292011 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5881f6c9-b857-44e4-b059-69f64682df56-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-bzgd8\" (UID: \"5881f6c9-b857-44e4-b059-69f64682df56\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-bzgd8" Apr 17 14:36:55.292288 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:36:55.292117 2582 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found Apr 17 14:36:55.292288 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:36:55.292185 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5881f6c9-b857-44e4-b059-69f64682df56-prometheus-operator-tls podName:5881f6c9-b857-44e4-b059-69f64682df56 nodeName:}" failed. No retries permitted until 2026-04-17 14:36:55.792166831 +0000 UTC m=+168.659552478 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/5881f6c9-b857-44e4-b059-69f64682df56-prometheus-operator-tls") pod "prometheus-operator-5676c8c784-bzgd8" (UID: "5881f6c9-b857-44e4-b059-69f64682df56") : secret "prometheus-operator-tls" not found Apr 17 14:36:55.292666 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:55.292646 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5881f6c9-b857-44e4-b059-69f64682df56-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-bzgd8\" (UID: \"5881f6c9-b857-44e4-b059-69f64682df56\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-bzgd8" Apr 17 14:36:55.294553 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:55.294529 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5881f6c9-b857-44e4-b059-69f64682df56-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-bzgd8\" (UID: \"5881f6c9-b857-44e4-b059-69f64682df56\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-bzgd8" Apr 17 14:36:55.302728 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:55.302706 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-499k2\" (UniqueName: \"kubernetes.io/projected/5881f6c9-b857-44e4-b059-69f64682df56-kube-api-access-499k2\") pod \"prometheus-operator-5676c8c784-bzgd8\" (UID: \"5881f6c9-b857-44e4-b059-69f64682df56\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-bzgd8" Apr 17 14:36:55.738983 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:55.738943 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4kdjq" Apr 17 14:36:55.796609 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:55.796567 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/5881f6c9-b857-44e4-b059-69f64682df56-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-bzgd8\" (UID: \"5881f6c9-b857-44e4-b059-69f64682df56\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-bzgd8" Apr 17 14:36:55.799276 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:55.799251 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/5881f6c9-b857-44e4-b059-69f64682df56-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-bzgd8\" (UID: \"5881f6c9-b857-44e4-b059-69f64682df56\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-bzgd8" Apr 17 14:36:55.973682 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:55.973634 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-bzgd8" Apr 17 14:36:56.098055 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:56.098019 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-bzgd8"] Apr 17 14:36:56.100674 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:36:56.100639 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5881f6c9_b857_44e4_b059_69f64682df56.slice/crio-4deb5361952d05609dc34e408dc53e6cf54480dcf4e5cef258a8b4772f529060 WatchSource:0}: Error finding container 4deb5361952d05609dc34e408dc53e6cf54480dcf4e5cef258a8b4772f529060: Status 404 returned error can't find the container with id 4deb5361952d05609dc34e408dc53e6cf54480dcf4e5cef258a8b4772f529060 Apr 17 14:36:56.277658 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:56.277573 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-bzgd8" event={"ID":"5881f6c9-b857-44e4-b059-69f64682df56","Type":"ContainerStarted","Data":"4deb5361952d05609dc34e408dc53e6cf54480dcf4e5cef258a8b4772f529060"} Apr 17 14:36:58.289643 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:58.289598 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-bzgd8" event={"ID":"5881f6c9-b857-44e4-b059-69f64682df56","Type":"ContainerStarted","Data":"bf95d4545ed9e91349412524aab99037ff21d686eb5f34aa4a3660df09b69787"} Apr 17 14:36:58.289643 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:58.289640 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-bzgd8" event={"ID":"5881f6c9-b857-44e4-b059-69f64682df56","Type":"ContainerStarted","Data":"a35839d7d34962e503031eb2c3c9c72641402b50dcdf019f787c595f49a55131"} Apr 17 14:36:58.306526 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:58.306480 2582 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-bzgd8" podStartSLOduration=1.927838471 podStartE2EDuration="3.306463322s" podCreationTimestamp="2026-04-17 14:36:55 +0000 UTC" firstStartedPulling="2026-04-17 14:36:56.10262688 +0000 UTC m=+168.970012511" lastFinishedPulling="2026-04-17 14:36:57.481251729 +0000 UTC m=+170.348637362" observedRunningTime="2026-04-17 14:36:58.305125228 +0000 UTC m=+171.172510879" watchObservedRunningTime="2026-04-17 14:36:58.306463322 +0000 UTC m=+171.173849019" Apr 17 14:36:59.738726 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:36:59.738692 2582 scope.go:117] "RemoveContainer" containerID="7a3939483d16566b5a94d8d09608ae7ae530efb345bc06f409479481a8e52166" Apr 17 14:36:59.739206 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:36:59.738962 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-bss6z_openshift-console-operator(8cf1a0e0-95fc-4bce-84ed-e3222e5beeb2)\"" pod="openshift-console-operator/console-operator-9d4b6777b-bss6z" podUID="8cf1a0e0-95fc-4bce-84ed-e3222e5beeb2" Apr 17 14:37:00.422611 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:00.422575 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-h9wzt"] Apr 17 14:37:00.426604 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:00.426581 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-h9wzt" Apr 17 14:37:00.429208 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:00.429180 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 17 14:37:00.429526 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:00.429507 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 17 14:37:00.429772 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:00.429754 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 17 14:37:00.430024 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:00.429766 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-zfm9f\"" Apr 17 14:37:00.537790 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:00.537747 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/ab6693d0-6bf3-4a27-be94-c542f254b76b-node-exporter-textfile\") pod \"node-exporter-h9wzt\" (UID: \"ab6693d0-6bf3-4a27-be94-c542f254b76b\") " pod="openshift-monitoring/node-exporter-h9wzt" Apr 17 14:37:00.537790 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:00.537828 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ab6693d0-6bf3-4a27-be94-c542f254b76b-metrics-client-ca\") pod \"node-exporter-h9wzt\" (UID: \"ab6693d0-6bf3-4a27-be94-c542f254b76b\") " pod="openshift-monitoring/node-exporter-h9wzt" Apr 17 14:37:00.538051 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:00.537859 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-nnm6w\" (UniqueName: \"kubernetes.io/projected/ab6693d0-6bf3-4a27-be94-c542f254b76b-kube-api-access-nnm6w\") pod \"node-exporter-h9wzt\" (UID: \"ab6693d0-6bf3-4a27-be94-c542f254b76b\") " pod="openshift-monitoring/node-exporter-h9wzt" Apr 17 14:37:00.538051 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:00.537918 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/ab6693d0-6bf3-4a27-be94-c542f254b76b-root\") pod \"node-exporter-h9wzt\" (UID: \"ab6693d0-6bf3-4a27-be94-c542f254b76b\") " pod="openshift-monitoring/node-exporter-h9wzt" Apr 17 14:37:00.538051 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:00.537948 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/ab6693d0-6bf3-4a27-be94-c542f254b76b-node-exporter-accelerators-collector-config\") pod \"node-exporter-h9wzt\" (UID: \"ab6693d0-6bf3-4a27-be94-c542f254b76b\") " pod="openshift-monitoring/node-exporter-h9wzt" Apr 17 14:37:00.538051 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:00.537988 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/ab6693d0-6bf3-4a27-be94-c542f254b76b-node-exporter-wtmp\") pod \"node-exporter-h9wzt\" (UID: \"ab6693d0-6bf3-4a27-be94-c542f254b76b\") " pod="openshift-monitoring/node-exporter-h9wzt" Apr 17 14:37:00.538051 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:00.538015 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ab6693d0-6bf3-4a27-be94-c542f254b76b-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-h9wzt\" (UID: \"ab6693d0-6bf3-4a27-be94-c542f254b76b\") " 
pod="openshift-monitoring/node-exporter-h9wzt" Apr 17 14:37:00.538051 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:00.538044 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ab6693d0-6bf3-4a27-be94-c542f254b76b-sys\") pod \"node-exporter-h9wzt\" (UID: \"ab6693d0-6bf3-4a27-be94-c542f254b76b\") " pod="openshift-monitoring/node-exporter-h9wzt" Apr 17 14:37:00.538357 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:00.538077 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/ab6693d0-6bf3-4a27-be94-c542f254b76b-node-exporter-tls\") pod \"node-exporter-h9wzt\" (UID: \"ab6693d0-6bf3-4a27-be94-c542f254b76b\") " pod="openshift-monitoring/node-exporter-h9wzt" Apr 17 14:37:00.639434 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:00.639394 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/ab6693d0-6bf3-4a27-be94-c542f254b76b-node-exporter-tls\") pod \"node-exporter-h9wzt\" (UID: \"ab6693d0-6bf3-4a27-be94-c542f254b76b\") " pod="openshift-monitoring/node-exporter-h9wzt" Apr 17 14:37:00.639612 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:00.639445 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/ab6693d0-6bf3-4a27-be94-c542f254b76b-node-exporter-textfile\") pod \"node-exporter-h9wzt\" (UID: \"ab6693d0-6bf3-4a27-be94-c542f254b76b\") " pod="openshift-monitoring/node-exporter-h9wzt" Apr 17 14:37:00.639612 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:00.639490 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ab6693d0-6bf3-4a27-be94-c542f254b76b-metrics-client-ca\") pod \"node-exporter-h9wzt\" 
(UID: \"ab6693d0-6bf3-4a27-be94-c542f254b76b\") " pod="openshift-monitoring/node-exporter-h9wzt"
Apr 17 14:37:00.639612 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:00.639514 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nnm6w\" (UniqueName: \"kubernetes.io/projected/ab6693d0-6bf3-4a27-be94-c542f254b76b-kube-api-access-nnm6w\") pod \"node-exporter-h9wzt\" (UID: \"ab6693d0-6bf3-4a27-be94-c542f254b76b\") " pod="openshift-monitoring/node-exporter-h9wzt"
Apr 17 14:37:00.639612 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:00.639554 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/ab6693d0-6bf3-4a27-be94-c542f254b76b-root\") pod \"node-exporter-h9wzt\" (UID: \"ab6693d0-6bf3-4a27-be94-c542f254b76b\") " pod="openshift-monitoring/node-exporter-h9wzt"
Apr 17 14:37:00.639612 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:37:00.639568 2582 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 17 14:37:00.639612 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:00.639579 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/ab6693d0-6bf3-4a27-be94-c542f254b76b-node-exporter-accelerators-collector-config\") pod \"node-exporter-h9wzt\" (UID: \"ab6693d0-6bf3-4a27-be94-c542f254b76b\") " pod="openshift-monitoring/node-exporter-h9wzt"
Apr 17 14:37:00.639945 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:00.639621 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/ab6693d0-6bf3-4a27-be94-c542f254b76b-node-exporter-wtmp\") pod \"node-exporter-h9wzt\" (UID: \"ab6693d0-6bf3-4a27-be94-c542f254b76b\") " pod="openshift-monitoring/node-exporter-h9wzt"
Apr 17 14:37:00.639945 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:37:00.639641 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ab6693d0-6bf3-4a27-be94-c542f254b76b-node-exporter-tls podName:ab6693d0-6bf3-4a27-be94-c542f254b76b nodeName:}" failed. No retries permitted until 2026-04-17 14:37:01.139620486 +0000 UTC m=+174.007006132 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/ab6693d0-6bf3-4a27-be94-c542f254b76b-node-exporter-tls") pod "node-exporter-h9wzt" (UID: "ab6693d0-6bf3-4a27-be94-c542f254b76b") : secret "node-exporter-tls" not found
Apr 17 14:37:00.639945 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:00.639678 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ab6693d0-6bf3-4a27-be94-c542f254b76b-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-h9wzt\" (UID: \"ab6693d0-6bf3-4a27-be94-c542f254b76b\") " pod="openshift-monitoring/node-exporter-h9wzt"
Apr 17 14:37:00.639945 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:00.639704 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ab6693d0-6bf3-4a27-be94-c542f254b76b-sys\") pod \"node-exporter-h9wzt\" (UID: \"ab6693d0-6bf3-4a27-be94-c542f254b76b\") " pod="openshift-monitoring/node-exporter-h9wzt"
Apr 17 14:37:00.639945 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:00.639755 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/ab6693d0-6bf3-4a27-be94-c542f254b76b-node-exporter-wtmp\") pod \"node-exporter-h9wzt\" (UID: \"ab6693d0-6bf3-4a27-be94-c542f254b76b\") " pod="openshift-monitoring/node-exporter-h9wzt"
Apr 17 14:37:00.639945 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:00.639776 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ab6693d0-6bf3-4a27-be94-c542f254b76b-sys\") pod \"node-exporter-h9wzt\" (UID: \"ab6693d0-6bf3-4a27-be94-c542f254b76b\") " pod="openshift-monitoring/node-exporter-h9wzt"
Apr 17 14:37:00.640222 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:00.639944 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/ab6693d0-6bf3-4a27-be94-c542f254b76b-root\") pod \"node-exporter-h9wzt\" (UID: \"ab6693d0-6bf3-4a27-be94-c542f254b76b\") " pod="openshift-monitoring/node-exporter-h9wzt"
Apr 17 14:37:00.640222 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:00.639967 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/ab6693d0-6bf3-4a27-be94-c542f254b76b-node-exporter-textfile\") pod \"node-exporter-h9wzt\" (UID: \"ab6693d0-6bf3-4a27-be94-c542f254b76b\") " pod="openshift-monitoring/node-exporter-h9wzt"
Apr 17 14:37:00.640222 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:00.640177 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ab6693d0-6bf3-4a27-be94-c542f254b76b-metrics-client-ca\") pod \"node-exporter-h9wzt\" (UID: \"ab6693d0-6bf3-4a27-be94-c542f254b76b\") " pod="openshift-monitoring/node-exporter-h9wzt"
Apr 17 14:37:00.640465 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:00.640446 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/ab6693d0-6bf3-4a27-be94-c542f254b76b-node-exporter-accelerators-collector-config\") pod \"node-exporter-h9wzt\" (UID: \"ab6693d0-6bf3-4a27-be94-c542f254b76b\") " pod="openshift-monitoring/node-exporter-h9wzt"
Apr 17 14:37:00.642551 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:00.642526 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ab6693d0-6bf3-4a27-be94-c542f254b76b-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-h9wzt\" (UID: \"ab6693d0-6bf3-4a27-be94-c542f254b76b\") " pod="openshift-monitoring/node-exporter-h9wzt"
Apr 17 14:37:00.648248 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:00.648218 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnm6w\" (UniqueName: \"kubernetes.io/projected/ab6693d0-6bf3-4a27-be94-c542f254b76b-kube-api-access-nnm6w\") pod \"node-exporter-h9wzt\" (UID: \"ab6693d0-6bf3-4a27-be94-c542f254b76b\") " pod="openshift-monitoring/node-exporter-h9wzt"
Apr 17 14:37:01.145019 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:01.144981 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/ab6693d0-6bf3-4a27-be94-c542f254b76b-node-exporter-tls\") pod \"node-exporter-h9wzt\" (UID: \"ab6693d0-6bf3-4a27-be94-c542f254b76b\") " pod="openshift-monitoring/node-exporter-h9wzt"
Apr 17 14:37:01.147545 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:01.147515 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/ab6693d0-6bf3-4a27-be94-c542f254b76b-node-exporter-tls\") pod \"node-exporter-h9wzt\" (UID: \"ab6693d0-6bf3-4a27-be94-c542f254b76b\") " pod="openshift-monitoring/node-exporter-h9wzt"
Apr 17 14:37:01.338135 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:01.338093 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-h9wzt"
Apr 17 14:37:01.348873 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:37:01.348835 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab6693d0_6bf3_4a27_be94_c542f254b76b.slice/crio-bfbab2c322f541113294e0ca0d72f70f5da289cacc782bdbcc1520a9dc8e02bc WatchSource:0}: Error finding container bfbab2c322f541113294e0ca0d72f70f5da289cacc782bdbcc1520a9dc8e02bc: Status 404 returned error can't find the container with id bfbab2c322f541113294e0ca0d72f70f5da289cacc782bdbcc1520a9dc8e02bc
Apr 17 14:37:01.913660 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:01.913627 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-7b97fb5868-crdxs"
Apr 17 14:37:02.269475 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:02.269450 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-kj9w5"
Apr 17 14:37:02.307546 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:02.307508 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-h9wzt" event={"ID":"ab6693d0-6bf3-4a27-be94-c542f254b76b","Type":"ContainerStarted","Data":"5704bac32d653d1aabb4d6507c6207d33e30c3fd06957c4a00acbd8a9309c794"}
Apr 17 14:37:02.307546 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:02.307554 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-h9wzt" event={"ID":"ab6693d0-6bf3-4a27-be94-c542f254b76b","Type":"ContainerStarted","Data":"bfbab2c322f541113294e0ca0d72f70f5da289cacc782bdbcc1520a9dc8e02bc"}
Apr 17 14:37:02.479491 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:02.479406 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-5fc8496d4f-vbjg9"]
Apr 17 14:37:02.482888 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:02.482867 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-5fc8496d4f-vbjg9"
Apr 17 14:37:02.485490 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:02.485466 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\""
Apr 17 14:37:02.485618 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:02.485561 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-45gk2\""
Apr 17 14:37:02.485723 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:02.485707 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\""
Apr 17 14:37:02.486123 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:02.486105 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\""
Apr 17 14:37:02.486289 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:02.486251 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-d58artunin4j7\""
Apr 17 14:37:02.486289 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:02.486261 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\""
Apr 17 14:37:02.486488 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:02.486475 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\""
Apr 17 14:37:02.499174 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:02.499152 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-5fc8496d4f-vbjg9"]
Apr 17 14:37:02.661475 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:02.661443 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1f793873-40a8-4eec-908a-2d5e2bdf7aa9-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5fc8496d4f-vbjg9\" (UID: \"1f793873-40a8-4eec-908a-2d5e2bdf7aa9\") " pod="openshift-monitoring/thanos-querier-5fc8496d4f-vbjg9"
Apr 17 14:37:02.661645 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:02.661485 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1f793873-40a8-4eec-908a-2d5e2bdf7aa9-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5fc8496d4f-vbjg9\" (UID: \"1f793873-40a8-4eec-908a-2d5e2bdf7aa9\") " pod="openshift-monitoring/thanos-querier-5fc8496d4f-vbjg9"
Apr 17 14:37:02.661645 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:02.661515 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfsrx\" (UniqueName: \"kubernetes.io/projected/1f793873-40a8-4eec-908a-2d5e2bdf7aa9-kube-api-access-vfsrx\") pod \"thanos-querier-5fc8496d4f-vbjg9\" (UID: \"1f793873-40a8-4eec-908a-2d5e2bdf7aa9\") " pod="openshift-monitoring/thanos-querier-5fc8496d4f-vbjg9"
Apr 17 14:37:02.661645 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:02.661586 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/1f793873-40a8-4eec-908a-2d5e2bdf7aa9-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5fc8496d4f-vbjg9\" (UID: \"1f793873-40a8-4eec-908a-2d5e2bdf7aa9\") " pod="openshift-monitoring/thanos-querier-5fc8496d4f-vbjg9"
Apr 17 14:37:02.661645 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:02.661626 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/1f793873-40a8-4eec-908a-2d5e2bdf7aa9-secret-thanos-querier-tls\") pod \"thanos-querier-5fc8496d4f-vbjg9\" (UID: \"1f793873-40a8-4eec-908a-2d5e2bdf7aa9\") " pod="openshift-monitoring/thanos-querier-5fc8496d4f-vbjg9"
Apr 17 14:37:02.661773 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:02.661647 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/1f793873-40a8-4eec-908a-2d5e2bdf7aa9-secret-grpc-tls\") pod \"thanos-querier-5fc8496d4f-vbjg9\" (UID: \"1f793873-40a8-4eec-908a-2d5e2bdf7aa9\") " pod="openshift-monitoring/thanos-querier-5fc8496d4f-vbjg9"
Apr 17 14:37:02.661773 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:02.661710 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/1f793873-40a8-4eec-908a-2d5e2bdf7aa9-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5fc8496d4f-vbjg9\" (UID: \"1f793873-40a8-4eec-908a-2d5e2bdf7aa9\") " pod="openshift-monitoring/thanos-querier-5fc8496d4f-vbjg9"
Apr 17 14:37:02.661773 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:02.661765 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1f793873-40a8-4eec-908a-2d5e2bdf7aa9-metrics-client-ca\") pod \"thanos-querier-5fc8496d4f-vbjg9\" (UID: \"1f793873-40a8-4eec-908a-2d5e2bdf7aa9\") " pod="openshift-monitoring/thanos-querier-5fc8496d4f-vbjg9"
Apr 17 14:37:02.762975 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:02.762940 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vfsrx\" (UniqueName: \"kubernetes.io/projected/1f793873-40a8-4eec-908a-2d5e2bdf7aa9-kube-api-access-vfsrx\") pod \"thanos-querier-5fc8496d4f-vbjg9\" (UID: \"1f793873-40a8-4eec-908a-2d5e2bdf7aa9\") " pod="openshift-monitoring/thanos-querier-5fc8496d4f-vbjg9"
Apr 17 14:37:02.763158 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:02.762987 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/1f793873-40a8-4eec-908a-2d5e2bdf7aa9-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5fc8496d4f-vbjg9\" (UID: \"1f793873-40a8-4eec-908a-2d5e2bdf7aa9\") " pod="openshift-monitoring/thanos-querier-5fc8496d4f-vbjg9"
Apr 17 14:37:02.763158 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:02.763011 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/1f793873-40a8-4eec-908a-2d5e2bdf7aa9-secret-thanos-querier-tls\") pod \"thanos-querier-5fc8496d4f-vbjg9\" (UID: \"1f793873-40a8-4eec-908a-2d5e2bdf7aa9\") " pod="openshift-monitoring/thanos-querier-5fc8496d4f-vbjg9"
Apr 17 14:37:02.763158 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:02.763051 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/1f793873-40a8-4eec-908a-2d5e2bdf7aa9-secret-grpc-tls\") pod \"thanos-querier-5fc8496d4f-vbjg9\" (UID: \"1f793873-40a8-4eec-908a-2d5e2bdf7aa9\") " pod="openshift-monitoring/thanos-querier-5fc8496d4f-vbjg9"
Apr 17 14:37:02.763158 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:02.763114 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/1f793873-40a8-4eec-908a-2d5e2bdf7aa9-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5fc8496d4f-vbjg9\" (UID: \"1f793873-40a8-4eec-908a-2d5e2bdf7aa9\") " pod="openshift-monitoring/thanos-querier-5fc8496d4f-vbjg9"
Apr 17 14:37:02.763158 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:02.763156 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1f793873-40a8-4eec-908a-2d5e2bdf7aa9-metrics-client-ca\") pod \"thanos-querier-5fc8496d4f-vbjg9\" (UID: \"1f793873-40a8-4eec-908a-2d5e2bdf7aa9\") " pod="openshift-monitoring/thanos-querier-5fc8496d4f-vbjg9"
Apr 17 14:37:02.763405 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:02.763185 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1f793873-40a8-4eec-908a-2d5e2bdf7aa9-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5fc8496d4f-vbjg9\" (UID: \"1f793873-40a8-4eec-908a-2d5e2bdf7aa9\") " pod="openshift-monitoring/thanos-querier-5fc8496d4f-vbjg9"
Apr 17 14:37:02.763405 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:02.763220 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1f793873-40a8-4eec-908a-2d5e2bdf7aa9-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5fc8496d4f-vbjg9\" (UID: \"1f793873-40a8-4eec-908a-2d5e2bdf7aa9\") " pod="openshift-monitoring/thanos-querier-5fc8496d4f-vbjg9"
Apr 17 14:37:02.764138 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:02.764078 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1f793873-40a8-4eec-908a-2d5e2bdf7aa9-metrics-client-ca\") pod \"thanos-querier-5fc8496d4f-vbjg9\" (UID: \"1f793873-40a8-4eec-908a-2d5e2bdf7aa9\") " pod="openshift-monitoring/thanos-querier-5fc8496d4f-vbjg9"
Apr 17 14:37:02.765701 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:02.765677 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/1f793873-40a8-4eec-908a-2d5e2bdf7aa9-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5fc8496d4f-vbjg9\" (UID: \"1f793873-40a8-4eec-908a-2d5e2bdf7aa9\") " pod="openshift-monitoring/thanos-querier-5fc8496d4f-vbjg9"
Apr 17 14:37:02.765960 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:02.765931 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/1f793873-40a8-4eec-908a-2d5e2bdf7aa9-secret-thanos-querier-tls\") pod \"thanos-querier-5fc8496d4f-vbjg9\" (UID: \"1f793873-40a8-4eec-908a-2d5e2bdf7aa9\") " pod="openshift-monitoring/thanos-querier-5fc8496d4f-vbjg9"
Apr 17 14:37:02.766209 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:02.766190 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/1f793873-40a8-4eec-908a-2d5e2bdf7aa9-secret-grpc-tls\") pod \"thanos-querier-5fc8496d4f-vbjg9\" (UID: \"1f793873-40a8-4eec-908a-2d5e2bdf7aa9\") " pod="openshift-monitoring/thanos-querier-5fc8496d4f-vbjg9"
Apr 17 14:37:02.766672 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:02.766653 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/1f793873-40a8-4eec-908a-2d5e2bdf7aa9-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5fc8496d4f-vbjg9\" (UID: \"1f793873-40a8-4eec-908a-2d5e2bdf7aa9\") " pod="openshift-monitoring/thanos-querier-5fc8496d4f-vbjg9"
Apr 17 14:37:02.766742 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:02.766694 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1f793873-40a8-4eec-908a-2d5e2bdf7aa9-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5fc8496d4f-vbjg9\" (UID: \"1f793873-40a8-4eec-908a-2d5e2bdf7aa9\") " pod="openshift-monitoring/thanos-querier-5fc8496d4f-vbjg9"
Apr 17 14:37:02.766792 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:02.766762 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1f793873-40a8-4eec-908a-2d5e2bdf7aa9-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5fc8496d4f-vbjg9\" (UID: \"1f793873-40a8-4eec-908a-2d5e2bdf7aa9\") " pod="openshift-monitoring/thanos-querier-5fc8496d4f-vbjg9"
Apr 17 14:37:02.769955 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:02.769935 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfsrx\" (UniqueName: \"kubernetes.io/projected/1f793873-40a8-4eec-908a-2d5e2bdf7aa9-kube-api-access-vfsrx\") pod \"thanos-querier-5fc8496d4f-vbjg9\" (UID: \"1f793873-40a8-4eec-908a-2d5e2bdf7aa9\") " pod="openshift-monitoring/thanos-querier-5fc8496d4f-vbjg9"
Apr 17 14:37:02.792905 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:02.792876 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-5fc8496d4f-vbjg9"
Apr 17 14:37:02.923877 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:02.923844 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-5fc8496d4f-vbjg9"]
Apr 17 14:37:02.927416 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:37:02.927387 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f793873_40a8_4eec_908a_2d5e2bdf7aa9.slice/crio-01f82b6bcaf050f9425c567eb6610c98f38cdb9a726053dd4e4495cce9f2169c WatchSource:0}: Error finding container 01f82b6bcaf050f9425c567eb6610c98f38cdb9a726053dd4e4495cce9f2169c: Status 404 returned error can't find the container with id 01f82b6bcaf050f9425c567eb6610c98f38cdb9a726053dd4e4495cce9f2169c
Apr 17 14:37:03.312160 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:03.312122 2582 generic.go:358] "Generic (PLEG): container finished" podID="ab6693d0-6bf3-4a27-be94-c542f254b76b" containerID="5704bac32d653d1aabb4d6507c6207d33e30c3fd06957c4a00acbd8a9309c794" exitCode=0
Apr 17 14:37:03.312583 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:03.312199 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-h9wzt" event={"ID":"ab6693d0-6bf3-4a27-be94-c542f254b76b","Type":"ContainerDied","Data":"5704bac32d653d1aabb4d6507c6207d33e30c3fd06957c4a00acbd8a9309c794"}
Apr 17 14:37:03.313356 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:03.313332 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5fc8496d4f-vbjg9" event={"ID":"1f793873-40a8-4eec-908a-2d5e2bdf7aa9","Type":"ContainerStarted","Data":"01f82b6bcaf050f9425c567eb6610c98f38cdb9a726053dd4e4495cce9f2169c"}
Apr 17 14:37:04.319408 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:04.319368 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-h9wzt" event={"ID":"ab6693d0-6bf3-4a27-be94-c542f254b76b","Type":"ContainerStarted","Data":"1dfb4e4ed017eed42ab7534f9bbd479cc8f668fd9d62bba4ad7a60e50190eb82"}
Apr 17 14:37:04.319936 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:04.319436 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-h9wzt" event={"ID":"ab6693d0-6bf3-4a27-be94-c542f254b76b","Type":"ContainerStarted","Data":"0947268ff2efebcd67b0083ff4c28a8de19aee3ef5c45339c3c52047e6b95e3c"}
Apr 17 14:37:04.354876 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:04.354824 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-h9wzt" podStartSLOduration=3.519736708 podStartE2EDuration="4.354781251s" podCreationTimestamp="2026-04-17 14:37:00 +0000 UTC" firstStartedPulling="2026-04-17 14:37:01.351270297 +0000 UTC m=+174.218655934" lastFinishedPulling="2026-04-17 14:37:02.186314835 +0000 UTC m=+175.053700477" observedRunningTime="2026-04-17 14:37:04.354191781 +0000 UTC m=+177.221577434" watchObservedRunningTime="2026-04-17 14:37:04.354781251 +0000 UTC m=+177.222166901"
Apr 17 14:37:05.324384 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:05.324349 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5fc8496d4f-vbjg9" event={"ID":"1f793873-40a8-4eec-908a-2d5e2bdf7aa9","Type":"ContainerStarted","Data":"a96c9969e9a6a7d9f4f992052f83a3ac4099a50983628ccfd33d354342772c2e"}
Apr 17 14:37:05.324384 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:05.324384 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5fc8496d4f-vbjg9" event={"ID":"1f793873-40a8-4eec-908a-2d5e2bdf7aa9","Type":"ContainerStarted","Data":"25e3d782eac6ea32d89c30131f86ba170508391d147c2523a6043b15e561d489"}
Apr 17 14:37:05.324784 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:05.324395 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5fc8496d4f-vbjg9" event={"ID":"1f793873-40a8-4eec-908a-2d5e2bdf7aa9","Type":"ContainerStarted","Data":"49f4555710f13b29a5f95769daf01f304b7c5891d36bfe028fdc097eb5404bd8"}
Apr 17 14:37:06.330696 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:06.330658 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5fc8496d4f-vbjg9" event={"ID":"1f793873-40a8-4eec-908a-2d5e2bdf7aa9","Type":"ContainerStarted","Data":"3880a3e4472dc3656e439818eeef75c5d1ba1867a41e4bb6af4af722b015c22b"}
Apr 17 14:37:06.330696 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:06.330700 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5fc8496d4f-vbjg9" event={"ID":"1f793873-40a8-4eec-908a-2d5e2bdf7aa9","Type":"ContainerStarted","Data":"195c0a4a8dd5b2628407f489c051d8e26606bf001a248d19b9c618dba0aa821c"}
Apr 17 14:37:06.331145 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:06.330713 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5fc8496d4f-vbjg9" event={"ID":"1f793873-40a8-4eec-908a-2d5e2bdf7aa9","Type":"ContainerStarted","Data":"72b81c190d5abf239d5466ba2e90de68eaf2c7deb1d1379a017bb2a14432b23b"}
Apr 17 14:37:06.331145 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:06.330837 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-5fc8496d4f-vbjg9"
Apr 17 14:37:06.358944 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:06.358876 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-5fc8496d4f-vbjg9" podStartSLOduration=1.38300039 podStartE2EDuration="4.358858068s" podCreationTimestamp="2026-04-17 14:37:02 +0000 UTC" firstStartedPulling="2026-04-17 14:37:02.929289201 +0000 UTC m=+175.796674832" lastFinishedPulling="2026-04-17 14:37:05.90514688 +0000 UTC m=+178.772532510" observedRunningTime="2026-04-17 14:37:06.357315493 +0000 UTC m=+179.224701149" watchObservedRunningTime="2026-04-17 14:37:06.358858068 +0000 UTC m=+179.226243719"
Apr 17 14:37:06.927732 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:06.927684 2582 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-7b97fb5868-crdxs" podUID="32387133-ec4d-425b-902f-441e1eccd234" containerName="registry" containerID="cri-o://ba0c11266cf95728e72f5014da033e9037d33d2b085ad471358e0e297a6a92b9" gracePeriod=30
Apr 17 14:37:07.162259 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:07.162228 2582 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7b97fb5868-crdxs"
Apr 17 14:37:07.307281 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:07.307241 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/32387133-ec4d-425b-902f-441e1eccd234-trusted-ca\") pod \"32387133-ec4d-425b-902f-441e1eccd234\" (UID: \"32387133-ec4d-425b-902f-441e1eccd234\") "
Apr 17 14:37:07.307485 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:07.307293 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/32387133-ec4d-425b-902f-441e1eccd234-ca-trust-extracted\") pod \"32387133-ec4d-425b-902f-441e1eccd234\" (UID: \"32387133-ec4d-425b-902f-441e1eccd234\") "
Apr 17 14:37:07.307485 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:07.307319 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/32387133-ec4d-425b-902f-441e1eccd234-registry-tls\") pod \"32387133-ec4d-425b-902f-441e1eccd234\" (UID: \"32387133-ec4d-425b-902f-441e1eccd234\") "
Apr 17 14:37:07.307485 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:07.307358 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/32387133-ec4d-425b-902f-441e1eccd234-registry-certificates\") pod \"32387133-ec4d-425b-902f-441e1eccd234\" (UID: \"32387133-ec4d-425b-902f-441e1eccd234\") "
Apr 17 14:37:07.307485 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:07.307416 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/32387133-ec4d-425b-902f-441e1eccd234-installation-pull-secrets\") pod \"32387133-ec4d-425b-902f-441e1eccd234\" (UID: \"32387133-ec4d-425b-902f-441e1eccd234\") "
Apr 17 14:37:07.307485 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:07.307455 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/32387133-ec4d-425b-902f-441e1eccd234-image-registry-private-configuration\") pod \"32387133-ec4d-425b-902f-441e1eccd234\" (UID: \"32387133-ec4d-425b-902f-441e1eccd234\") "
Apr 17 14:37:07.307738 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:07.307501 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l6vmk\" (UniqueName: \"kubernetes.io/projected/32387133-ec4d-425b-902f-441e1eccd234-kube-api-access-l6vmk\") pod \"32387133-ec4d-425b-902f-441e1eccd234\" (UID: \"32387133-ec4d-425b-902f-441e1eccd234\") "
Apr 17 14:37:07.307738 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:07.307531 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/32387133-ec4d-425b-902f-441e1eccd234-bound-sa-token\") pod \"32387133-ec4d-425b-902f-441e1eccd234\" (UID: \"32387133-ec4d-425b-902f-441e1eccd234\") "
Apr 17 14:37:07.307881 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:07.307844 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32387133-ec4d-425b-902f-441e1eccd234-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "32387133-ec4d-425b-902f-441e1eccd234" (UID: "32387133-ec4d-425b-902f-441e1eccd234"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 14:37:07.308062 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:07.308034 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32387133-ec4d-425b-902f-441e1eccd234-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "32387133-ec4d-425b-902f-441e1eccd234" (UID: "32387133-ec4d-425b-902f-441e1eccd234"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 14:37:07.310138 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:07.310105 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32387133-ec4d-425b-902f-441e1eccd234-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "32387133-ec4d-425b-902f-441e1eccd234" (UID: "32387133-ec4d-425b-902f-441e1eccd234"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 14:37:07.310138 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:07.310127 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32387133-ec4d-425b-902f-441e1eccd234-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "32387133-ec4d-425b-902f-441e1eccd234" (UID: "32387133-ec4d-425b-902f-441e1eccd234"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 14:37:07.310359 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:07.310330 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32387133-ec4d-425b-902f-441e1eccd234-kube-api-access-l6vmk" (OuterVolumeSpecName: "kube-api-access-l6vmk") pod "32387133-ec4d-425b-902f-441e1eccd234" (UID: "32387133-ec4d-425b-902f-441e1eccd234"). InnerVolumeSpecName "kube-api-access-l6vmk". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 14:37:07.310466 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:07.310391 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32387133-ec4d-425b-902f-441e1eccd234-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "32387133-ec4d-425b-902f-441e1eccd234" (UID: "32387133-ec4d-425b-902f-441e1eccd234"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 14:37:07.310640 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:07.310623 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32387133-ec4d-425b-902f-441e1eccd234-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "32387133-ec4d-425b-902f-441e1eccd234" (UID: "32387133-ec4d-425b-902f-441e1eccd234"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 14:37:07.316157 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:07.316130 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32387133-ec4d-425b-902f-441e1eccd234-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "32387133-ec4d-425b-902f-441e1eccd234" (UID: "32387133-ec4d-425b-902f-441e1eccd234"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 14:37:07.334204 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:07.334176 2582 generic.go:358] "Generic (PLEG): container finished" podID="32387133-ec4d-425b-902f-441e1eccd234" containerID="ba0c11266cf95728e72f5014da033e9037d33d2b085ad471358e0e297a6a92b9" exitCode=0
Apr 17 14:37:07.334530 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:07.334234 2582 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7b97fb5868-crdxs"
Apr 17 14:37:07.334530 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:07.334257 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7b97fb5868-crdxs" event={"ID":"32387133-ec4d-425b-902f-441e1eccd234","Type":"ContainerDied","Data":"ba0c11266cf95728e72f5014da033e9037d33d2b085ad471358e0e297a6a92b9"}
Apr 17 14:37:07.334530 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:07.334297 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7b97fb5868-crdxs" event={"ID":"32387133-ec4d-425b-902f-441e1eccd234","Type":"ContainerDied","Data":"f43e468af173a742b828bd57f0c69123a8bb11a3c5afcdf0e89e0b8655b06e50"}
Apr 17 14:37:07.334530 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:07.334315 2582 scope.go:117] "RemoveContainer" containerID="ba0c11266cf95728e72f5014da033e9037d33d2b085ad471358e0e297a6a92b9"
Apr 17 14:37:07.342495 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:07.342476 2582 scope.go:117] "RemoveContainer" containerID="ba0c11266cf95728e72f5014da033e9037d33d2b085ad471358e0e297a6a92b9"
Apr 17 14:37:07.342753 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:37:07.342737 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba0c11266cf95728e72f5014da033e9037d33d2b085ad471358e0e297a6a92b9\": container with ID starting with 
ba0c11266cf95728e72f5014da033e9037d33d2b085ad471358e0e297a6a92b9 not found: ID does not exist" containerID="ba0c11266cf95728e72f5014da033e9037d33d2b085ad471358e0e297a6a92b9" Apr 17 14:37:07.342826 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:07.342760 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba0c11266cf95728e72f5014da033e9037d33d2b085ad471358e0e297a6a92b9"} err="failed to get container status \"ba0c11266cf95728e72f5014da033e9037d33d2b085ad471358e0e297a6a92b9\": rpc error: code = NotFound desc = could not find container \"ba0c11266cf95728e72f5014da033e9037d33d2b085ad471358e0e297a6a92b9\": container with ID starting with ba0c11266cf95728e72f5014da033e9037d33d2b085ad471358e0e297a6a92b9 not found: ID does not exist" Apr 17 14:37:07.353400 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:07.353374 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-7b97fb5868-crdxs"] Apr 17 14:37:07.358727 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:07.358703 2582 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-7b97fb5868-crdxs"] Apr 17 14:37:07.408279 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:07.408242 2582 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/32387133-ec4d-425b-902f-441e1eccd234-registry-tls\") on node \"ip-10-0-143-171.ec2.internal\" DevicePath \"\"" Apr 17 14:37:07.408279 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:07.408280 2582 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/32387133-ec4d-425b-902f-441e1eccd234-registry-certificates\") on node \"ip-10-0-143-171.ec2.internal\" DevicePath \"\"" Apr 17 14:37:07.408465 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:07.408298 2582 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" 
(UniqueName: \"kubernetes.io/secret/32387133-ec4d-425b-902f-441e1eccd234-installation-pull-secrets\") on node \"ip-10-0-143-171.ec2.internal\" DevicePath \"\"" Apr 17 14:37:07.408465 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:07.408320 2582 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/32387133-ec4d-425b-902f-441e1eccd234-image-registry-private-configuration\") on node \"ip-10-0-143-171.ec2.internal\" DevicePath \"\"" Apr 17 14:37:07.408465 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:07.408336 2582 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-l6vmk\" (UniqueName: \"kubernetes.io/projected/32387133-ec4d-425b-902f-441e1eccd234-kube-api-access-l6vmk\") on node \"ip-10-0-143-171.ec2.internal\" DevicePath \"\"" Apr 17 14:37:07.408465 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:07.408352 2582 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/32387133-ec4d-425b-902f-441e1eccd234-bound-sa-token\") on node \"ip-10-0-143-171.ec2.internal\" DevicePath \"\"" Apr 17 14:37:07.408465 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:07.408366 2582 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/32387133-ec4d-425b-902f-441e1eccd234-trusted-ca\") on node \"ip-10-0-143-171.ec2.internal\" DevicePath \"\"" Apr 17 14:37:07.408465 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:07.408380 2582 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/32387133-ec4d-425b-902f-441e1eccd234-ca-trust-extracted\") on node \"ip-10-0-143-171.ec2.internal\" DevicePath \"\"" Apr 17 14:37:07.743029 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:07.742995 2582 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32387133-ec4d-425b-902f-441e1eccd234" 
path="/var/lib/kubelet/pods/32387133-ec4d-425b-902f-441e1eccd234/volumes" Apr 17 14:37:12.341379 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:12.341346 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-5fc8496d4f-vbjg9" Apr 17 14:37:13.738735 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:13.738703 2582 scope.go:117] "RemoveContainer" containerID="7a3939483d16566b5a94d8d09608ae7ae530efb345bc06f409479481a8e52166" Apr 17 14:37:14.358490 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:14.358464 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-bss6z_8cf1a0e0-95fc-4bce-84ed-e3222e5beeb2/console-operator/2.log" Apr 17 14:37:14.358661 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:14.358513 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-bss6z" event={"ID":"8cf1a0e0-95fc-4bce-84ed-e3222e5beeb2","Type":"ContainerStarted","Data":"cc02c368dfefafb4095c219c15bb4e011c4a6b920eb056060fd65e82277e72a5"} Apr 17 14:37:14.358796 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:14.358780 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-bss6z" Apr 17 14:37:14.376200 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:14.376139 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-bss6z" podStartSLOduration=54.986275881 podStartE2EDuration="58.376121327s" podCreationTimestamp="2026-04-17 14:36:16 +0000 UTC" firstStartedPulling="2026-04-17 14:36:16.757552375 +0000 UTC m=+129.624938008" lastFinishedPulling="2026-04-17 14:36:20.147397807 +0000 UTC m=+133.014783454" observedRunningTime="2026-04-17 14:37:14.374470598 +0000 UTC m=+187.241856246" watchObservedRunningTime="2026-04-17 14:37:14.376121327 +0000 UTC m=+187.243506979" 
Apr 17 14:37:14.666994 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:14.666913 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-bss6z" Apr 17 14:37:14.835402 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:14.835361 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-rgv47"] Apr 17 14:37:14.835746 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:14.835684 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="32387133-ec4d-425b-902f-441e1eccd234" containerName="registry" Apr 17 14:37:14.835746 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:14.835695 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="32387133-ec4d-425b-902f-441e1eccd234" containerName="registry" Apr 17 14:37:14.835746 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:14.835744 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="32387133-ec4d-425b-902f-441e1eccd234" containerName="registry" Apr 17 14:37:14.838541 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:14.838520 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-rgv47" Apr 17 14:37:14.840937 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:14.840914 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 17 14:37:14.841059 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:14.840913 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 17 14:37:14.841059 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:14.840960 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-lcfcf\"" Apr 17 14:37:14.848689 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:14.848668 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-rgv47"] Apr 17 14:37:14.865621 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:14.865591 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwm8n\" (UniqueName: \"kubernetes.io/projected/5ed0f5e3-c06f-4fe3-9938-13df41a47562-kube-api-access-kwm8n\") pod \"downloads-6bcc868b7-rgv47\" (UID: \"5ed0f5e3-c06f-4fe3-9938-13df41a47562\") " pod="openshift-console/downloads-6bcc868b7-rgv47" Apr 17 14:37:14.966832 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:14.966715 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kwm8n\" (UniqueName: \"kubernetes.io/projected/5ed0f5e3-c06f-4fe3-9938-13df41a47562-kube-api-access-kwm8n\") pod \"downloads-6bcc868b7-rgv47\" (UID: \"5ed0f5e3-c06f-4fe3-9938-13df41a47562\") " pod="openshift-console/downloads-6bcc868b7-rgv47" Apr 17 14:37:14.974288 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:14.974258 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwm8n\" (UniqueName: 
\"kubernetes.io/projected/5ed0f5e3-c06f-4fe3-9938-13df41a47562-kube-api-access-kwm8n\") pod \"downloads-6bcc868b7-rgv47\" (UID: \"5ed0f5e3-c06f-4fe3-9938-13df41a47562\") " pod="openshift-console/downloads-6bcc868b7-rgv47" Apr 17 14:37:15.148071 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:15.148035 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-rgv47" Apr 17 14:37:15.268214 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:15.268174 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-rgv47"] Apr 17 14:37:15.271435 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:37:15.271405 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ed0f5e3_c06f_4fe3_9938_13df41a47562.slice/crio-b66acfad13999a9c04287f81bbd43c58b3eb5b53a762e52b63f80fc527ccdf37 WatchSource:0}: Error finding container b66acfad13999a9c04287f81bbd43c58b3eb5b53a762e52b63f80fc527ccdf37: Status 404 returned error can't find the container with id b66acfad13999a9c04287f81bbd43c58b3eb5b53a762e52b63f80fc527ccdf37 Apr 17 14:37:15.362359 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:15.362318 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-rgv47" event={"ID":"5ed0f5e3-c06f-4fe3-9938-13df41a47562","Type":"ContainerStarted","Data":"b66acfad13999a9c04287f81bbd43c58b3eb5b53a762e52b63f80fc527ccdf37"} Apr 17 14:37:20.567253 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:20.567217 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-8445d78569-q6kl7"] Apr 17 14:37:20.571903 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:20.571884 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-8445d78569-q6kl7" Apr 17 14:37:20.577324 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:20.577296 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 17 14:37:20.577492 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:20.577304 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 17 14:37:20.578414 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:20.578383 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 17 14:37:20.578504 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:20.578395 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-lw4t2\"" Apr 17 14:37:20.578504 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:20.578470 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 17 14:37:20.585412 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:20.585389 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-8445d78569-q6kl7"] Apr 17 14:37:20.589531 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:20.589510 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 17 14:37:20.614360 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:20.614325 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f045d01e-2791-48e3-852c-b050172fb4e3-service-ca\") pod \"console-8445d78569-q6kl7\" (UID: \"f045d01e-2791-48e3-852c-b050172fb4e3\") " pod="openshift-console/console-8445d78569-q6kl7" Apr 17 14:37:20.614542 ip-10-0-143-171 kubenswrapper[2582]: I0417 
14:37:20.614373 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f045d01e-2791-48e3-852c-b050172fb4e3-console-serving-cert\") pod \"console-8445d78569-q6kl7\" (UID: \"f045d01e-2791-48e3-852c-b050172fb4e3\") " pod="openshift-console/console-8445d78569-q6kl7" Apr 17 14:37:20.614542 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:20.614391 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f045d01e-2791-48e3-852c-b050172fb4e3-console-oauth-config\") pod \"console-8445d78569-q6kl7\" (UID: \"f045d01e-2791-48e3-852c-b050172fb4e3\") " pod="openshift-console/console-8445d78569-q6kl7" Apr 17 14:37:20.614542 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:20.614406 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f045d01e-2791-48e3-852c-b050172fb4e3-oauth-serving-cert\") pod \"console-8445d78569-q6kl7\" (UID: \"f045d01e-2791-48e3-852c-b050172fb4e3\") " pod="openshift-console/console-8445d78569-q6kl7" Apr 17 14:37:20.614542 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:20.614503 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9spw\" (UniqueName: \"kubernetes.io/projected/f045d01e-2791-48e3-852c-b050172fb4e3-kube-api-access-g9spw\") pod \"console-8445d78569-q6kl7\" (UID: \"f045d01e-2791-48e3-852c-b050172fb4e3\") " pod="openshift-console/console-8445d78569-q6kl7" Apr 17 14:37:20.614716 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:20.614546 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f045d01e-2791-48e3-852c-b050172fb4e3-console-config\") pod 
\"console-8445d78569-q6kl7\" (UID: \"f045d01e-2791-48e3-852c-b050172fb4e3\") " pod="openshift-console/console-8445d78569-q6kl7" Apr 17 14:37:20.715783 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:20.715747 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f045d01e-2791-48e3-852c-b050172fb4e3-console-config\") pod \"console-8445d78569-q6kl7\" (UID: \"f045d01e-2791-48e3-852c-b050172fb4e3\") " pod="openshift-console/console-8445d78569-q6kl7" Apr 17 14:37:20.715986 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:20.715871 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f045d01e-2791-48e3-852c-b050172fb4e3-service-ca\") pod \"console-8445d78569-q6kl7\" (UID: \"f045d01e-2791-48e3-852c-b050172fb4e3\") " pod="openshift-console/console-8445d78569-q6kl7" Apr 17 14:37:20.715986 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:20.715914 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f045d01e-2791-48e3-852c-b050172fb4e3-console-serving-cert\") pod \"console-8445d78569-q6kl7\" (UID: \"f045d01e-2791-48e3-852c-b050172fb4e3\") " pod="openshift-console/console-8445d78569-q6kl7" Apr 17 14:37:20.715986 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:20.715941 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f045d01e-2791-48e3-852c-b050172fb4e3-console-oauth-config\") pod \"console-8445d78569-q6kl7\" (UID: \"f045d01e-2791-48e3-852c-b050172fb4e3\") " pod="openshift-console/console-8445d78569-q6kl7" Apr 17 14:37:20.715986 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:20.715968 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/f045d01e-2791-48e3-852c-b050172fb4e3-oauth-serving-cert\") pod \"console-8445d78569-q6kl7\" (UID: \"f045d01e-2791-48e3-852c-b050172fb4e3\") " pod="openshift-console/console-8445d78569-q6kl7" Apr 17 14:37:20.716198 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:20.716015 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g9spw\" (UniqueName: \"kubernetes.io/projected/f045d01e-2791-48e3-852c-b050172fb4e3-kube-api-access-g9spw\") pod \"console-8445d78569-q6kl7\" (UID: \"f045d01e-2791-48e3-852c-b050172fb4e3\") " pod="openshift-console/console-8445d78569-q6kl7" Apr 17 14:37:20.716634 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:20.716582 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f045d01e-2791-48e3-852c-b050172fb4e3-console-config\") pod \"console-8445d78569-q6kl7\" (UID: \"f045d01e-2791-48e3-852c-b050172fb4e3\") " pod="openshift-console/console-8445d78569-q6kl7" Apr 17 14:37:20.716928 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:20.716721 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f045d01e-2791-48e3-852c-b050172fb4e3-oauth-serving-cert\") pod \"console-8445d78569-q6kl7\" (UID: \"f045d01e-2791-48e3-852c-b050172fb4e3\") " pod="openshift-console/console-8445d78569-q6kl7" Apr 17 14:37:20.716993 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:20.716949 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f045d01e-2791-48e3-852c-b050172fb4e3-service-ca\") pod \"console-8445d78569-q6kl7\" (UID: \"f045d01e-2791-48e3-852c-b050172fb4e3\") " pod="openshift-console/console-8445d78569-q6kl7" Apr 17 14:37:20.718704 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:20.718679 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f045d01e-2791-48e3-852c-b050172fb4e3-console-oauth-config\") pod \"console-8445d78569-q6kl7\" (UID: \"f045d01e-2791-48e3-852c-b050172fb4e3\") " pod="openshift-console/console-8445d78569-q6kl7" Apr 17 14:37:20.718865 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:20.718847 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f045d01e-2791-48e3-852c-b050172fb4e3-console-serving-cert\") pod \"console-8445d78569-q6kl7\" (UID: \"f045d01e-2791-48e3-852c-b050172fb4e3\") " pod="openshift-console/console-8445d78569-q6kl7" Apr 17 14:37:20.724283 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:20.724261 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9spw\" (UniqueName: \"kubernetes.io/projected/f045d01e-2791-48e3-852c-b050172fb4e3-kube-api-access-g9spw\") pod \"console-8445d78569-q6kl7\" (UID: \"f045d01e-2791-48e3-852c-b050172fb4e3\") " pod="openshift-console/console-8445d78569-q6kl7" Apr 17 14:37:20.881552 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:20.881456 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-8445d78569-q6kl7" Apr 17 14:37:21.022222 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:21.022196 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-8445d78569-q6kl7"] Apr 17 14:37:21.025002 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:37:21.024963 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf045d01e_2791_48e3_852c_b050172fb4e3.slice/crio-ff5e300be668cc01cfe2dc7971ba02fa9cb011143fdedf4b716d8ff0fdc1c52a WatchSource:0}: Error finding container ff5e300be668cc01cfe2dc7971ba02fa9cb011143fdedf4b716d8ff0fdc1c52a: Status 404 returned error can't find the container with id ff5e300be668cc01cfe2dc7971ba02fa9cb011143fdedf4b716d8ff0fdc1c52a Apr 17 14:37:21.387084 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:21.387040 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-8445d78569-q6kl7" event={"ID":"f045d01e-2791-48e3-852c-b050172fb4e3","Type":"ContainerStarted","Data":"ff5e300be668cc01cfe2dc7971ba02fa9cb011143fdedf4b716d8ff0fdc1c52a"} Apr 17 14:37:25.401081 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:25.401044 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-8445d78569-q6kl7" event={"ID":"f045d01e-2791-48e3-852c-b050172fb4e3","Type":"ContainerStarted","Data":"3f18a30d93235b344c41ab812acc3a887cf640dab4fa7924699508e4f11227b3"} Apr 17 14:37:25.418524 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:25.418460 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-8445d78569-q6kl7" podStartSLOduration=2.114309822 podStartE2EDuration="5.418445192s" podCreationTimestamp="2026-04-17 14:37:20 +0000 UTC" firstStartedPulling="2026-04-17 14:37:21.027336689 +0000 UTC m=+193.894722318" lastFinishedPulling="2026-04-17 14:37:24.331472052 +0000 UTC m=+197.198857688" 
observedRunningTime="2026-04-17 14:37:25.417326402 +0000 UTC m=+198.284712055" watchObservedRunningTime="2026-04-17 14:37:25.418445192 +0000 UTC m=+198.285830843" Apr 17 14:37:30.882337 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:30.882238 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-8445d78569-q6kl7" Apr 17 14:37:30.882337 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:30.882321 2582 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-8445d78569-q6kl7" Apr 17 14:37:30.888257 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:30.888226 2582 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-8445d78569-q6kl7" Apr 17 14:37:31.424699 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:31.424672 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-8445d78569-q6kl7" Apr 17 14:37:34.431478 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:34.431427 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-rgv47" event={"ID":"5ed0f5e3-c06f-4fe3-9938-13df41a47562","Type":"ContainerStarted","Data":"61f95178f821eb7fd0ccee5092d571ee0e54bb1cf5350647930789bc6c3ae4b1"} Apr 17 14:37:34.432017 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:34.431990 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-rgv47" Apr 17 14:37:34.433066 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:34.433035 2582 generic.go:358] "Generic (PLEG): container finished" podID="c73897be-24cb-49ee-a735-e2c35eb461f4" containerID="185f47a56f260d76058c154b7396d3d408787c493b43b66122d0ab7bfcc9f539" exitCode=0 Apr 17 14:37:34.433207 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:34.433115 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-insights/insights-operator-585dfdc468-9kpqq" event={"ID":"c73897be-24cb-49ee-a735-e2c35eb461f4","Type":"ContainerDied","Data":"185f47a56f260d76058c154b7396d3d408787c493b43b66122d0ab7bfcc9f539"} Apr 17 14:37:34.433548 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:34.433525 2582 scope.go:117] "RemoveContainer" containerID="185f47a56f260d76058c154b7396d3d408787c493b43b66122d0ab7bfcc9f539" Apr 17 14:37:34.444044 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:34.444014 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-rgv47" Apr 17 14:37:34.448126 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:34.448051 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-rgv47" podStartSLOduration=2.22162873 podStartE2EDuration="20.44803212s" podCreationTimestamp="2026-04-17 14:37:14 +0000 UTC" firstStartedPulling="2026-04-17 14:37:15.273698639 +0000 UTC m=+188.141084269" lastFinishedPulling="2026-04-17 14:37:33.50010203 +0000 UTC m=+206.367487659" observedRunningTime="2026-04-17 14:37:34.447994652 +0000 UTC m=+207.315380315" watchObservedRunningTime="2026-04-17 14:37:34.44803212 +0000 UTC m=+207.315417773" Apr 17 14:37:34.822681 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:34.822646 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-87v7h_76590649-d620-489b-9a3c-5c78ec32d35e/serve-healthcheck-canary/0.log" Apr 17 14:37:35.438554 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:35.438511 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-9kpqq" event={"ID":"c73897be-24cb-49ee-a735-e2c35eb461f4","Type":"ContainerStarted","Data":"9fdaa88b1949557f201e592eeb2e9e5d1acae1e4e891ab21453ea4a2640bb030"} Apr 17 14:37:41.458718 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:41.458679 2582 generic.go:358] "Generic (PLEG): container 
finished" podID="84ae5f81-3a81-49e9-a3fb-cad167d0281b" containerID="2a7a50fde6eb370fdc2e1d717865fb5580677affa7cdf3f53e76fa7ef14a75f7" exitCode=0 Apr 17 14:37:41.459259 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:41.458764 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qq9b8" event={"ID":"84ae5f81-3a81-49e9-a3fb-cad167d0281b","Type":"ContainerDied","Data":"2a7a50fde6eb370fdc2e1d717865fb5580677affa7cdf3f53e76fa7ef14a75f7"} Apr 17 14:37:41.459259 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:41.459198 2582 scope.go:117] "RemoveContainer" containerID="2a7a50fde6eb370fdc2e1d717865fb5580677affa7cdf3f53e76fa7ef14a75f7" Apr 17 14:37:41.973389 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:41.973351 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-8445d78569-q6kl7"] Apr 17 14:37:42.464409 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:37:42.464371 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qq9b8" event={"ID":"84ae5f81-3a81-49e9-a3fb-cad167d0281b","Type":"ContainerStarted","Data":"f8e9ac137426af1c6e92214b7d542055ea04daecb6518956345970eee353689c"} Apr 17 14:38:06.998699 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:38:06.998640 2582 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-8445d78569-q6kl7" podUID="f045d01e-2791-48e3-852c-b050172fb4e3" containerName="console" containerID="cri-o://3f18a30d93235b344c41ab812acc3a887cf640dab4fa7924699508e4f11227b3" gracePeriod=15 Apr 17 14:38:07.276693 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:38:07.276670 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-8445d78569-q6kl7_f045d01e-2791-48e3-852c-b050172fb4e3/console/0.log" Apr 17 14:38:07.276837 ip-10-0-143-171 
kubenswrapper[2582]: I0417 14:38:07.276731 2582 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-8445d78569-q6kl7" Apr 17 14:38:07.343079 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:38:07.343048 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9spw\" (UniqueName: \"kubernetes.io/projected/f045d01e-2791-48e3-852c-b050172fb4e3-kube-api-access-g9spw\") pod \"f045d01e-2791-48e3-852c-b050172fb4e3\" (UID: \"f045d01e-2791-48e3-852c-b050172fb4e3\") " Apr 17 14:38:07.343079 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:38:07.343084 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f045d01e-2791-48e3-852c-b050172fb4e3-console-config\") pod \"f045d01e-2791-48e3-852c-b050172fb4e3\" (UID: \"f045d01e-2791-48e3-852c-b050172fb4e3\") " Apr 17 14:38:07.343309 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:38:07.343149 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f045d01e-2791-48e3-852c-b050172fb4e3-service-ca\") pod \"f045d01e-2791-48e3-852c-b050172fb4e3\" (UID: \"f045d01e-2791-48e3-852c-b050172fb4e3\") " Apr 17 14:38:07.343309 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:38:07.343171 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f045d01e-2791-48e3-852c-b050172fb4e3-console-serving-cert\") pod \"f045d01e-2791-48e3-852c-b050172fb4e3\" (UID: \"f045d01e-2791-48e3-852c-b050172fb4e3\") " Apr 17 14:38:07.343309 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:38:07.343194 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f045d01e-2791-48e3-852c-b050172fb4e3-oauth-serving-cert\") pod 
\"f045d01e-2791-48e3-852c-b050172fb4e3\" (UID: \"f045d01e-2791-48e3-852c-b050172fb4e3\") " Apr 17 14:38:07.343309 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:38:07.343213 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f045d01e-2791-48e3-852c-b050172fb4e3-console-oauth-config\") pod \"f045d01e-2791-48e3-852c-b050172fb4e3\" (UID: \"f045d01e-2791-48e3-852c-b050172fb4e3\") " Apr 17 14:38:07.343588 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:38:07.343557 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f045d01e-2791-48e3-852c-b050172fb4e3-console-config" (OuterVolumeSpecName: "console-config") pod "f045d01e-2791-48e3-852c-b050172fb4e3" (UID: "f045d01e-2791-48e3-852c-b050172fb4e3"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 14:38:07.343670 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:38:07.343555 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f045d01e-2791-48e3-852c-b050172fb4e3-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "f045d01e-2791-48e3-852c-b050172fb4e3" (UID: "f045d01e-2791-48e3-852c-b050172fb4e3"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 14:38:07.343670 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:38:07.343564 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f045d01e-2791-48e3-852c-b050172fb4e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "f045d01e-2791-48e3-852c-b050172fb4e3" (UID: "f045d01e-2791-48e3-852c-b050172fb4e3"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 14:38:07.345527 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:38:07.345493 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f045d01e-2791-48e3-852c-b050172fb4e3-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "f045d01e-2791-48e3-852c-b050172fb4e3" (UID: "f045d01e-2791-48e3-852c-b050172fb4e3"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 14:38:07.345527 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:38:07.345521 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f045d01e-2791-48e3-852c-b050172fb4e3-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "f045d01e-2791-48e3-852c-b050172fb4e3" (UID: "f045d01e-2791-48e3-852c-b050172fb4e3"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 14:38:07.345670 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:38:07.345587 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f045d01e-2791-48e3-852c-b050172fb4e3-kube-api-access-g9spw" (OuterVolumeSpecName: "kube-api-access-g9spw") pod "f045d01e-2791-48e3-852c-b050172fb4e3" (UID: "f045d01e-2791-48e3-852c-b050172fb4e3"). InnerVolumeSpecName "kube-api-access-g9spw". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 14:38:07.444352 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:38:07.444312 2582 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f045d01e-2791-48e3-852c-b050172fb4e3-service-ca\") on node \"ip-10-0-143-171.ec2.internal\" DevicePath \"\"" Apr 17 14:38:07.444352 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:38:07.444347 2582 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f045d01e-2791-48e3-852c-b050172fb4e3-console-serving-cert\") on node \"ip-10-0-143-171.ec2.internal\" DevicePath \"\"" Apr 17 14:38:07.444352 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:38:07.444357 2582 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f045d01e-2791-48e3-852c-b050172fb4e3-oauth-serving-cert\") on node \"ip-10-0-143-171.ec2.internal\" DevicePath \"\"" Apr 17 14:38:07.444577 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:38:07.444366 2582 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f045d01e-2791-48e3-852c-b050172fb4e3-console-oauth-config\") on node \"ip-10-0-143-171.ec2.internal\" DevicePath \"\"" Apr 17 14:38:07.444577 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:38:07.444375 2582 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-g9spw\" (UniqueName: \"kubernetes.io/projected/f045d01e-2791-48e3-852c-b050172fb4e3-kube-api-access-g9spw\") on node \"ip-10-0-143-171.ec2.internal\" DevicePath \"\"" Apr 17 14:38:07.444577 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:38:07.444385 2582 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f045d01e-2791-48e3-852c-b050172fb4e3-console-config\") on node \"ip-10-0-143-171.ec2.internal\" DevicePath \"\"" Apr 17 14:38:07.545049 ip-10-0-143-171 
kubenswrapper[2582]: I0417 14:38:07.544978 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-8445d78569-q6kl7_f045d01e-2791-48e3-852c-b050172fb4e3/console/0.log" Apr 17 14:38:07.545049 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:38:07.545022 2582 generic.go:358] "Generic (PLEG): container finished" podID="f045d01e-2791-48e3-852c-b050172fb4e3" containerID="3f18a30d93235b344c41ab812acc3a887cf640dab4fa7924699508e4f11227b3" exitCode=2 Apr 17 14:38:07.545215 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:38:07.545052 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-8445d78569-q6kl7" event={"ID":"f045d01e-2791-48e3-852c-b050172fb4e3","Type":"ContainerDied","Data":"3f18a30d93235b344c41ab812acc3a887cf640dab4fa7924699508e4f11227b3"} Apr 17 14:38:07.545215 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:38:07.545098 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-8445d78569-q6kl7" event={"ID":"f045d01e-2791-48e3-852c-b050172fb4e3","Type":"ContainerDied","Data":"ff5e300be668cc01cfe2dc7971ba02fa9cb011143fdedf4b716d8ff0fdc1c52a"} Apr 17 14:38:07.545215 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:38:07.545119 2582 scope.go:117] "RemoveContainer" containerID="3f18a30d93235b344c41ab812acc3a887cf640dab4fa7924699508e4f11227b3" Apr 17 14:38:07.545215 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:38:07.545121 2582 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-8445d78569-q6kl7" Apr 17 14:38:07.553495 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:38:07.553471 2582 scope.go:117] "RemoveContainer" containerID="3f18a30d93235b344c41ab812acc3a887cf640dab4fa7924699508e4f11227b3" Apr 17 14:38:07.553768 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:38:07.553749 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f18a30d93235b344c41ab812acc3a887cf640dab4fa7924699508e4f11227b3\": container with ID starting with 3f18a30d93235b344c41ab812acc3a887cf640dab4fa7924699508e4f11227b3 not found: ID does not exist" containerID="3f18a30d93235b344c41ab812acc3a887cf640dab4fa7924699508e4f11227b3" Apr 17 14:38:07.553848 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:38:07.553776 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f18a30d93235b344c41ab812acc3a887cf640dab4fa7924699508e4f11227b3"} err="failed to get container status \"3f18a30d93235b344c41ab812acc3a887cf640dab4fa7924699508e4f11227b3\": rpc error: code = NotFound desc = could not find container \"3f18a30d93235b344c41ab812acc3a887cf640dab4fa7924699508e4f11227b3\": container with ID starting with 3f18a30d93235b344c41ab812acc3a887cf640dab4fa7924699508e4f11227b3 not found: ID does not exist" Apr 17 14:38:07.568933 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:38:07.568909 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-8445d78569-q6kl7"] Apr 17 14:38:07.576725 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:38:07.576695 2582 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-8445d78569-q6kl7"] Apr 17 14:38:07.742433 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:38:07.742392 2582 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f045d01e-2791-48e3-852c-b050172fb4e3" 
path="/var/lib/kubelet/pods/f045d01e-2791-48e3-852c-b050172fb4e3/volumes" Apr 17 14:38:18.537622 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:38:18.537579 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fbcf40f6-2ec0-4fb3-85d8-30ecb284384d-metrics-certs\") pod \"network-metrics-daemon-4kdjq\" (UID: \"fbcf40f6-2ec0-4fb3-85d8-30ecb284384d\") " pod="openshift-multus/network-metrics-daemon-4kdjq" Apr 17 14:38:18.540098 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:38:18.540065 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fbcf40f6-2ec0-4fb3-85d8-30ecb284384d-metrics-certs\") pod \"network-metrics-daemon-4kdjq\" (UID: \"fbcf40f6-2ec0-4fb3-85d8-30ecb284384d\") " pod="openshift-multus/network-metrics-daemon-4kdjq" Apr 17 14:38:18.841923 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:38:18.841835 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-tg92k\"" Apr 17 14:38:18.850639 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:38:18.850618 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4kdjq" Apr 17 14:38:18.971898 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:38:18.971695 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-4kdjq"] Apr 17 14:38:18.974419 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:38:18.974393 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfbcf40f6_2ec0_4fb3_85d8_30ecb284384d.slice/crio-baffa04710dbaf7a4b62ef7a94d31c5b74afd3dbed6bed12db6587d0af2ebfe2 WatchSource:0}: Error finding container baffa04710dbaf7a4b62ef7a94d31c5b74afd3dbed6bed12db6587d0af2ebfe2: Status 404 returned error can't find the container with id baffa04710dbaf7a4b62ef7a94d31c5b74afd3dbed6bed12db6587d0af2ebfe2 Apr 17 14:38:19.584012 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:38:19.583968 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4kdjq" event={"ID":"fbcf40f6-2ec0-4fb3-85d8-30ecb284384d","Type":"ContainerStarted","Data":"baffa04710dbaf7a4b62ef7a94d31c5b74afd3dbed6bed12db6587d0af2ebfe2"} Apr 17 14:38:20.588442 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:38:20.588404 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4kdjq" event={"ID":"fbcf40f6-2ec0-4fb3-85d8-30ecb284384d","Type":"ContainerStarted","Data":"ae7fbc3abc1690aa3e479089ee189c61f0698cd6444687682a2b07a829b0b5ca"} Apr 17 14:38:20.588442 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:38:20.588441 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4kdjq" event={"ID":"fbcf40f6-2ec0-4fb3-85d8-30ecb284384d","Type":"ContainerStarted","Data":"96a41a7e02d07ea92e5b1d545db3cdb33660f8bbe74d0151bd6eb80db4651bd2"} Apr 17 14:38:20.602779 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:38:20.602713 2582 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-multus/network-metrics-daemon-4kdjq" podStartSLOduration=252.454428085 podStartE2EDuration="4m13.602691892s" podCreationTimestamp="2026-04-17 14:34:07 +0000 UTC" firstStartedPulling="2026-04-17 14:38:18.976222737 +0000 UTC m=+251.843608366" lastFinishedPulling="2026-04-17 14:38:20.124486541 +0000 UTC m=+252.991872173" observedRunningTime="2026-04-17 14:38:20.601931663 +0000 UTC m=+253.469317314" watchObservedRunningTime="2026-04-17 14:38:20.602691892 +0000 UTC m=+253.470077544" Apr 17 14:38:29.112968 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:38:29.112890 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6986b65754-pghpz"] Apr 17 14:38:29.113401 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:38:29.113198 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f045d01e-2791-48e3-852c-b050172fb4e3" containerName="console" Apr 17 14:38:29.113401 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:38:29.113208 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="f045d01e-2791-48e3-852c-b050172fb4e3" containerName="console" Apr 17 14:38:29.113401 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:38:29.113268 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="f045d01e-2791-48e3-852c-b050172fb4e3" containerName="console" Apr 17 14:38:29.116366 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:38:29.116340 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6986b65754-pghpz" Apr 17 14:38:29.118830 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:38:29.118796 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 17 14:38:29.119663 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:38:29.119642 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 17 14:38:29.119761 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:38:29.119676 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 17 14:38:29.119761 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:38:29.119725 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-lw4t2\"" Apr 17 14:38:29.119761 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:38:29.119736 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 17 14:38:29.119956 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:38:29.119898 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 17 14:38:29.127762 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:38:29.127740 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6986b65754-pghpz"] Apr 17 14:38:29.128040 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:38:29.128015 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 17 14:38:29.221946 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:38:29.221909 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/a1a7d9d9-8c8c-4bc5-9081-6a00053e49af-console-serving-cert\") pod \"console-6986b65754-pghpz\" (UID: \"a1a7d9d9-8c8c-4bc5-9081-6a00053e49af\") " pod="openshift-console/console-6986b65754-pghpz" Apr 17 14:38:29.221946 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:38:29.221954 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a1a7d9d9-8c8c-4bc5-9081-6a00053e49af-console-oauth-config\") pod \"console-6986b65754-pghpz\" (UID: \"a1a7d9d9-8c8c-4bc5-9081-6a00053e49af\") " pod="openshift-console/console-6986b65754-pghpz" Apr 17 14:38:29.222161 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:38:29.222000 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a1a7d9d9-8c8c-4bc5-9081-6a00053e49af-oauth-serving-cert\") pod \"console-6986b65754-pghpz\" (UID: \"a1a7d9d9-8c8c-4bc5-9081-6a00053e49af\") " pod="openshift-console/console-6986b65754-pghpz" Apr 17 14:38:29.222161 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:38:29.222025 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a1a7d9d9-8c8c-4bc5-9081-6a00053e49af-console-config\") pod \"console-6986b65754-pghpz\" (UID: \"a1a7d9d9-8c8c-4bc5-9081-6a00053e49af\") " pod="openshift-console/console-6986b65754-pghpz" Apr 17 14:38:29.222161 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:38:29.222058 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwmzz\" (UniqueName: \"kubernetes.io/projected/a1a7d9d9-8c8c-4bc5-9081-6a00053e49af-kube-api-access-hwmzz\") pod \"console-6986b65754-pghpz\" (UID: \"a1a7d9d9-8c8c-4bc5-9081-6a00053e49af\") " pod="openshift-console/console-6986b65754-pghpz" Apr 17 14:38:29.222161 ip-10-0-143-171 
kubenswrapper[2582]: I0417 14:38:29.222122 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a1a7d9d9-8c8c-4bc5-9081-6a00053e49af-trusted-ca-bundle\") pod \"console-6986b65754-pghpz\" (UID: \"a1a7d9d9-8c8c-4bc5-9081-6a00053e49af\") " pod="openshift-console/console-6986b65754-pghpz" Apr 17 14:38:29.222299 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:38:29.222161 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a1a7d9d9-8c8c-4bc5-9081-6a00053e49af-service-ca\") pod \"console-6986b65754-pghpz\" (UID: \"a1a7d9d9-8c8c-4bc5-9081-6a00053e49af\") " pod="openshift-console/console-6986b65754-pghpz" Apr 17 14:38:29.323113 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:38:29.323080 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a1a7d9d9-8c8c-4bc5-9081-6a00053e49af-console-serving-cert\") pod \"console-6986b65754-pghpz\" (UID: \"a1a7d9d9-8c8c-4bc5-9081-6a00053e49af\") " pod="openshift-console/console-6986b65754-pghpz" Apr 17 14:38:29.323113 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:38:29.323112 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a1a7d9d9-8c8c-4bc5-9081-6a00053e49af-console-oauth-config\") pod \"console-6986b65754-pghpz\" (UID: \"a1a7d9d9-8c8c-4bc5-9081-6a00053e49af\") " pod="openshift-console/console-6986b65754-pghpz" Apr 17 14:38:29.323324 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:38:29.323139 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a1a7d9d9-8c8c-4bc5-9081-6a00053e49af-oauth-serving-cert\") pod \"console-6986b65754-pghpz\" (UID: 
\"a1a7d9d9-8c8c-4bc5-9081-6a00053e49af\") " pod="openshift-console/console-6986b65754-pghpz" Apr 17 14:38:29.323324 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:38:29.323158 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a1a7d9d9-8c8c-4bc5-9081-6a00053e49af-console-config\") pod \"console-6986b65754-pghpz\" (UID: \"a1a7d9d9-8c8c-4bc5-9081-6a00053e49af\") " pod="openshift-console/console-6986b65754-pghpz" Apr 17 14:38:29.323324 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:38:29.323205 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hwmzz\" (UniqueName: \"kubernetes.io/projected/a1a7d9d9-8c8c-4bc5-9081-6a00053e49af-kube-api-access-hwmzz\") pod \"console-6986b65754-pghpz\" (UID: \"a1a7d9d9-8c8c-4bc5-9081-6a00053e49af\") " pod="openshift-console/console-6986b65754-pghpz" Apr 17 14:38:29.323324 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:38:29.323246 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a1a7d9d9-8c8c-4bc5-9081-6a00053e49af-trusted-ca-bundle\") pod \"console-6986b65754-pghpz\" (UID: \"a1a7d9d9-8c8c-4bc5-9081-6a00053e49af\") " pod="openshift-console/console-6986b65754-pghpz" Apr 17 14:38:29.323324 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:38:29.323279 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a1a7d9d9-8c8c-4bc5-9081-6a00053e49af-service-ca\") pod \"console-6986b65754-pghpz\" (UID: \"a1a7d9d9-8c8c-4bc5-9081-6a00053e49af\") " pod="openshift-console/console-6986b65754-pghpz" Apr 17 14:38:29.324085 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:38:29.324055 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/a1a7d9d9-8c8c-4bc5-9081-6a00053e49af-console-config\") pod \"console-6986b65754-pghpz\" (UID: \"a1a7d9d9-8c8c-4bc5-9081-6a00053e49af\") " pod="openshift-console/console-6986b65754-pghpz" Apr 17 14:38:29.324205 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:38:29.324081 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a1a7d9d9-8c8c-4bc5-9081-6a00053e49af-oauth-serving-cert\") pod \"console-6986b65754-pghpz\" (UID: \"a1a7d9d9-8c8c-4bc5-9081-6a00053e49af\") " pod="openshift-console/console-6986b65754-pghpz" Apr 17 14:38:29.324205 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:38:29.324055 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a1a7d9d9-8c8c-4bc5-9081-6a00053e49af-service-ca\") pod \"console-6986b65754-pghpz\" (UID: \"a1a7d9d9-8c8c-4bc5-9081-6a00053e49af\") " pod="openshift-console/console-6986b65754-pghpz" Apr 17 14:38:29.324205 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:38:29.324168 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a1a7d9d9-8c8c-4bc5-9081-6a00053e49af-trusted-ca-bundle\") pod \"console-6986b65754-pghpz\" (UID: \"a1a7d9d9-8c8c-4bc5-9081-6a00053e49af\") " pod="openshift-console/console-6986b65754-pghpz" Apr 17 14:38:29.325683 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:38:29.325660 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a1a7d9d9-8c8c-4bc5-9081-6a00053e49af-console-oauth-config\") pod \"console-6986b65754-pghpz\" (UID: \"a1a7d9d9-8c8c-4bc5-9081-6a00053e49af\") " pod="openshift-console/console-6986b65754-pghpz" Apr 17 14:38:29.325780 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:38:29.325751 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a1a7d9d9-8c8c-4bc5-9081-6a00053e49af-console-serving-cert\") pod \"console-6986b65754-pghpz\" (UID: \"a1a7d9d9-8c8c-4bc5-9081-6a00053e49af\") " pod="openshift-console/console-6986b65754-pghpz" Apr 17 14:38:29.330983 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:38:29.330954 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwmzz\" (UniqueName: \"kubernetes.io/projected/a1a7d9d9-8c8c-4bc5-9081-6a00053e49af-kube-api-access-hwmzz\") pod \"console-6986b65754-pghpz\" (UID: \"a1a7d9d9-8c8c-4bc5-9081-6a00053e49af\") " pod="openshift-console/console-6986b65754-pghpz" Apr 17 14:38:29.434323 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:38:29.434226 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6986b65754-pghpz" Apr 17 14:38:29.567885 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:38:29.567853 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6986b65754-pghpz"] Apr 17 14:38:29.571087 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:38:29.571057 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1a7d9d9_8c8c_4bc5_9081_6a00053e49af.slice/crio-d07e6860b0be994d473c937f8db54293db7326b9e565b28dd243ad0aa91d3821 WatchSource:0}: Error finding container d07e6860b0be994d473c937f8db54293db7326b9e565b28dd243ad0aa91d3821: Status 404 returned error can't find the container with id d07e6860b0be994d473c937f8db54293db7326b9e565b28dd243ad0aa91d3821 Apr 17 14:38:29.619336 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:38:29.619298 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6986b65754-pghpz" event={"ID":"a1a7d9d9-8c8c-4bc5-9081-6a00053e49af","Type":"ContainerStarted","Data":"d07e6860b0be994d473c937f8db54293db7326b9e565b28dd243ad0aa91d3821"} Apr 17 14:38:30.623618 ip-10-0-143-171 
kubenswrapper[2582]: I0417 14:38:30.623573 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6986b65754-pghpz" event={"ID":"a1a7d9d9-8c8c-4bc5-9081-6a00053e49af","Type":"ContainerStarted","Data":"b099aebd77ac6176d089cc7a80c3d99b125f9efee42024392e51ecbe670f1921"} Apr 17 14:38:30.639354 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:38:30.639309 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6986b65754-pghpz" podStartSLOduration=1.639295013 podStartE2EDuration="1.639295013s" podCreationTimestamp="2026-04-17 14:38:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 14:38:30.638868395 +0000 UTC m=+263.506254045" watchObservedRunningTime="2026-04-17 14:38:30.639295013 +0000 UTC m=+263.506680664" Apr 17 14:38:39.434873 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:38:39.434791 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6986b65754-pghpz" Apr 17 14:38:39.434873 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:38:39.434876 2582 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6986b65754-pghpz" Apr 17 14:38:39.439657 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:38:39.439632 2582 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6986b65754-pghpz" Apr 17 14:38:39.653512 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:38:39.653487 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6986b65754-pghpz" Apr 17 14:39:07.616900 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:39:07.616872 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-bss6z_8cf1a0e0-95fc-4bce-84ed-e3222e5beeb2/console-operator/2.log" Apr 17 
14:39:07.617421 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:39:07.616872 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-bss6z_8cf1a0e0-95fc-4bce-84ed-e3222e5beeb2/console-operator/2.log" Apr 17 14:39:07.625929 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:39:07.625905 2582 kubelet.go:1628] "Image garbage collection succeeded" Apr 17 14:39:47.337182 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:39:47.337139 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6986b65754-pghpz"] Apr 17 14:40:12.363416 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:40:12.363319 2582 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6986b65754-pghpz" podUID="a1a7d9d9-8c8c-4bc5-9081-6a00053e49af" containerName="console" containerID="cri-o://b099aebd77ac6176d089cc7a80c3d99b125f9efee42024392e51ecbe670f1921" gracePeriod=15 Apr 17 14:40:12.601102 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:40:12.601079 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6986b65754-pghpz_a1a7d9d9-8c8c-4bc5-9081-6a00053e49af/console/0.log" Apr 17 14:40:12.601202 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:40:12.601139 2582 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6986b65754-pghpz"
Apr 17 14:40:12.744318 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:40:12.744282 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a1a7d9d9-8c8c-4bc5-9081-6a00053e49af-console-config\") pod \"a1a7d9d9-8c8c-4bc5-9081-6a00053e49af\" (UID: \"a1a7d9d9-8c8c-4bc5-9081-6a00053e49af\") "
Apr 17 14:40:12.744488 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:40:12.744331 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a1a7d9d9-8c8c-4bc5-9081-6a00053e49af-oauth-serving-cert\") pod \"a1a7d9d9-8c8c-4bc5-9081-6a00053e49af\" (UID: \"a1a7d9d9-8c8c-4bc5-9081-6a00053e49af\") "
Apr 17 14:40:12.744488 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:40:12.744365 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a1a7d9d9-8c8c-4bc5-9081-6a00053e49af-console-oauth-config\") pod \"a1a7d9d9-8c8c-4bc5-9081-6a00053e49af\" (UID: \"a1a7d9d9-8c8c-4bc5-9081-6a00053e49af\") "
Apr 17 14:40:12.744488 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:40:12.744385 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hwmzz\" (UniqueName: \"kubernetes.io/projected/a1a7d9d9-8c8c-4bc5-9081-6a00053e49af-kube-api-access-hwmzz\") pod \"a1a7d9d9-8c8c-4bc5-9081-6a00053e49af\" (UID: \"a1a7d9d9-8c8c-4bc5-9081-6a00053e49af\") "
Apr 17 14:40:12.744488 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:40:12.744412 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a1a7d9d9-8c8c-4bc5-9081-6a00053e49af-console-serving-cert\") pod \"a1a7d9d9-8c8c-4bc5-9081-6a00053e49af\" (UID: \"a1a7d9d9-8c8c-4bc5-9081-6a00053e49af\") "
Apr 17 14:40:12.744488 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:40:12.744439 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a1a7d9d9-8c8c-4bc5-9081-6a00053e49af-trusted-ca-bundle\") pod \"a1a7d9d9-8c8c-4bc5-9081-6a00053e49af\" (UID: \"a1a7d9d9-8c8c-4bc5-9081-6a00053e49af\") "
Apr 17 14:40:12.744488 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:40:12.744456 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a1a7d9d9-8c8c-4bc5-9081-6a00053e49af-service-ca\") pod \"a1a7d9d9-8c8c-4bc5-9081-6a00053e49af\" (UID: \"a1a7d9d9-8c8c-4bc5-9081-6a00053e49af\") "
Apr 17 14:40:12.744902 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:40:12.744780 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1a7d9d9-8c8c-4bc5-9081-6a00053e49af-console-config" (OuterVolumeSpecName: "console-config") pod "a1a7d9d9-8c8c-4bc5-9081-6a00053e49af" (UID: "a1a7d9d9-8c8c-4bc5-9081-6a00053e49af"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 14:40:12.744995 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:40:12.744917 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1a7d9d9-8c8c-4bc5-9081-6a00053e49af-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "a1a7d9d9-8c8c-4bc5-9081-6a00053e49af" (UID: "a1a7d9d9-8c8c-4bc5-9081-6a00053e49af"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 14:40:12.745053 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:40:12.744994 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1a7d9d9-8c8c-4bc5-9081-6a00053e49af-service-ca" (OuterVolumeSpecName: "service-ca") pod "a1a7d9d9-8c8c-4bc5-9081-6a00053e49af" (UID: "a1a7d9d9-8c8c-4bc5-9081-6a00053e49af"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 14:40:12.745094 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:40:12.745038 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1a7d9d9-8c8c-4bc5-9081-6a00053e49af-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "a1a7d9d9-8c8c-4bc5-9081-6a00053e49af" (UID: "a1a7d9d9-8c8c-4bc5-9081-6a00053e49af"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 14:40:12.746757 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:40:12.746732 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1a7d9d9-8c8c-4bc5-9081-6a00053e49af-kube-api-access-hwmzz" (OuterVolumeSpecName: "kube-api-access-hwmzz") pod "a1a7d9d9-8c8c-4bc5-9081-6a00053e49af" (UID: "a1a7d9d9-8c8c-4bc5-9081-6a00053e49af"). InnerVolumeSpecName "kube-api-access-hwmzz". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 14:40:12.747122 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:40:12.747100 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1a7d9d9-8c8c-4bc5-9081-6a00053e49af-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "a1a7d9d9-8c8c-4bc5-9081-6a00053e49af" (UID: "a1a7d9d9-8c8c-4bc5-9081-6a00053e49af"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 14:40:12.747219 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:40:12.747200 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1a7d9d9-8c8c-4bc5-9081-6a00053e49af-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "a1a7d9d9-8c8c-4bc5-9081-6a00053e49af" (UID: "a1a7d9d9-8c8c-4bc5-9081-6a00053e49af"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 14:40:12.845777 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:40:12.845723 2582 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a1a7d9d9-8c8c-4bc5-9081-6a00053e49af-console-serving-cert\") on node \"ip-10-0-143-171.ec2.internal\" DevicePath \"\""
Apr 17 14:40:12.845777 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:40:12.845772 2582 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a1a7d9d9-8c8c-4bc5-9081-6a00053e49af-trusted-ca-bundle\") on node \"ip-10-0-143-171.ec2.internal\" DevicePath \"\""
Apr 17 14:40:12.845777 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:40:12.845784 2582 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a1a7d9d9-8c8c-4bc5-9081-6a00053e49af-service-ca\") on node \"ip-10-0-143-171.ec2.internal\" DevicePath \"\""
Apr 17 14:40:12.845777 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:40:12.845793 2582 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a1a7d9d9-8c8c-4bc5-9081-6a00053e49af-console-config\") on node \"ip-10-0-143-171.ec2.internal\" DevicePath \"\""
Apr 17 14:40:12.846072 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:40:12.845829 2582 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a1a7d9d9-8c8c-4bc5-9081-6a00053e49af-oauth-serving-cert\") on node \"ip-10-0-143-171.ec2.internal\" DevicePath \"\""
Apr 17 14:40:12.846072 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:40:12.845838 2582 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a1a7d9d9-8c8c-4bc5-9081-6a00053e49af-console-oauth-config\") on node \"ip-10-0-143-171.ec2.internal\" DevicePath \"\""
Apr 17 14:40:12.846072 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:40:12.845847 2582 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hwmzz\" (UniqueName: \"kubernetes.io/projected/a1a7d9d9-8c8c-4bc5-9081-6a00053e49af-kube-api-access-hwmzz\") on node \"ip-10-0-143-171.ec2.internal\" DevicePath \"\""
Apr 17 14:40:12.908351 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:40:12.908325 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6986b65754-pghpz_a1a7d9d9-8c8c-4bc5-9081-6a00053e49af/console/0.log"
Apr 17 14:40:12.908528 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:40:12.908364 2582 generic.go:358] "Generic (PLEG): container finished" podID="a1a7d9d9-8c8c-4bc5-9081-6a00053e49af" containerID="b099aebd77ac6176d089cc7a80c3d99b125f9efee42024392e51ecbe670f1921" exitCode=2
Apr 17 14:40:12.908528 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:40:12.908409 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6986b65754-pghpz" event={"ID":"a1a7d9d9-8c8c-4bc5-9081-6a00053e49af","Type":"ContainerDied","Data":"b099aebd77ac6176d089cc7a80c3d99b125f9efee42024392e51ecbe670f1921"}
Apr 17 14:40:12.908528 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:40:12.908430 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6986b65754-pghpz" event={"ID":"a1a7d9d9-8c8c-4bc5-9081-6a00053e49af","Type":"ContainerDied","Data":"d07e6860b0be994d473c937f8db54293db7326b9e565b28dd243ad0aa91d3821"}
Apr 17 14:40:12.908528 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:40:12.908445 2582 scope.go:117] "RemoveContainer" containerID="b099aebd77ac6176d089cc7a80c3d99b125f9efee42024392e51ecbe670f1921"
Apr 17 14:40:12.908528 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:40:12.908445 2582 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6986b65754-pghpz"
Apr 17 14:40:12.919100 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:40:12.919079 2582 scope.go:117] "RemoveContainer" containerID="b099aebd77ac6176d089cc7a80c3d99b125f9efee42024392e51ecbe670f1921"
Apr 17 14:40:12.919482 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:40:12.919461 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b099aebd77ac6176d089cc7a80c3d99b125f9efee42024392e51ecbe670f1921\": container with ID starting with b099aebd77ac6176d089cc7a80c3d99b125f9efee42024392e51ecbe670f1921 not found: ID does not exist" containerID="b099aebd77ac6176d089cc7a80c3d99b125f9efee42024392e51ecbe670f1921"
Apr 17 14:40:12.919557 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:40:12.919494 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b099aebd77ac6176d089cc7a80c3d99b125f9efee42024392e51ecbe670f1921"} err="failed to get container status \"b099aebd77ac6176d089cc7a80c3d99b125f9efee42024392e51ecbe670f1921\": rpc error: code = NotFound desc = could not find container \"b099aebd77ac6176d089cc7a80c3d99b125f9efee42024392e51ecbe670f1921\": container with ID starting with b099aebd77ac6176d089cc7a80c3d99b125f9efee42024392e51ecbe670f1921 not found: ID does not exist"
Apr 17 14:40:12.930038 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:40:12.930007 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6986b65754-pghpz"]
Apr 17 14:40:12.933776 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:40:12.933749 2582 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6986b65754-pghpz"]
Apr 17 14:40:13.745847 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:40:13.743598 2582 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1a7d9d9-8c8c-4bc5-9081-6a00053e49af" path="/var/lib/kubelet/pods/a1a7d9d9-8c8c-4bc5-9081-6a00053e49af/volumes"
Apr 17 14:40:29.231280 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:40:29.231242 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-z82rb"]
Apr 17 14:40:29.231745 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:40:29.231541 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a1a7d9d9-8c8c-4bc5-9081-6a00053e49af" containerName="console"
Apr 17 14:40:29.231745 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:40:29.231552 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1a7d9d9-8c8c-4bc5-9081-6a00053e49af" containerName="console"
Apr 17 14:40:29.231745 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:40:29.231605 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="a1a7d9d9-8c8c-4bc5-9081-6a00053e49af" containerName="console"
Apr 17 14:40:29.234461 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:40:29.234444 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-z82rb"
Apr 17 14:40:29.236867 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:40:29.236842 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 17 14:40:29.241642 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:40:29.241615 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-z82rb"]
Apr 17 14:40:29.266225 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:40:29.266193 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/1c999004-4a51-40aa-b675-8213daef0914-dbus\") pod \"global-pull-secret-syncer-z82rb\" (UID: \"1c999004-4a51-40aa-b675-8213daef0914\") " pod="kube-system/global-pull-secret-syncer-z82rb"
Apr 17 14:40:29.266393 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:40:29.266248 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/1c999004-4a51-40aa-b675-8213daef0914-kubelet-config\") pod \"global-pull-secret-syncer-z82rb\" (UID: \"1c999004-4a51-40aa-b675-8213daef0914\") " pod="kube-system/global-pull-secret-syncer-z82rb"
Apr 17 14:40:29.266393 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:40:29.266272 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/1c999004-4a51-40aa-b675-8213daef0914-original-pull-secret\") pod \"global-pull-secret-syncer-z82rb\" (UID: \"1c999004-4a51-40aa-b675-8213daef0914\") " pod="kube-system/global-pull-secret-syncer-z82rb"
Apr 17 14:40:29.367211 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:40:29.367154 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/1c999004-4a51-40aa-b675-8213daef0914-dbus\") pod \"global-pull-secret-syncer-z82rb\" (UID: \"1c999004-4a51-40aa-b675-8213daef0914\") " pod="kube-system/global-pull-secret-syncer-z82rb"
Apr 17 14:40:29.367415 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:40:29.367225 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/1c999004-4a51-40aa-b675-8213daef0914-kubelet-config\") pod \"global-pull-secret-syncer-z82rb\" (UID: \"1c999004-4a51-40aa-b675-8213daef0914\") " pod="kube-system/global-pull-secret-syncer-z82rb"
Apr 17 14:40:29.367415 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:40:29.367251 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/1c999004-4a51-40aa-b675-8213daef0914-original-pull-secret\") pod \"global-pull-secret-syncer-z82rb\" (UID: \"1c999004-4a51-40aa-b675-8213daef0914\") " pod="kube-system/global-pull-secret-syncer-z82rb"
Apr 17 14:40:29.367415 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:40:29.367330 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/1c999004-4a51-40aa-b675-8213daef0914-kubelet-config\") pod \"global-pull-secret-syncer-z82rb\" (UID: \"1c999004-4a51-40aa-b675-8213daef0914\") " pod="kube-system/global-pull-secret-syncer-z82rb"
Apr 17 14:40:29.367415 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:40:29.367355 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/1c999004-4a51-40aa-b675-8213daef0914-dbus\") pod \"global-pull-secret-syncer-z82rb\" (UID: \"1c999004-4a51-40aa-b675-8213daef0914\") " pod="kube-system/global-pull-secret-syncer-z82rb"
Apr 17 14:40:29.369820 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:40:29.369774 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/1c999004-4a51-40aa-b675-8213daef0914-original-pull-secret\") pod \"global-pull-secret-syncer-z82rb\" (UID: \"1c999004-4a51-40aa-b675-8213daef0914\") " pod="kube-system/global-pull-secret-syncer-z82rb"
Apr 17 14:40:29.544210 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:40:29.544167 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-z82rb"
Apr 17 14:40:29.676244 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:40:29.676208 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-z82rb"]
Apr 17 14:40:29.680303 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:40:29.680260 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c999004_4a51_40aa_b675_8213daef0914.slice/crio-19c2d008840476c4034045f20bf6a65660ed388131259900f669cc6f5dc93baa WatchSource:0}: Error finding container 19c2d008840476c4034045f20bf6a65660ed388131259900f669cc6f5dc93baa: Status 404 returned error can't find the container with id 19c2d008840476c4034045f20bf6a65660ed388131259900f669cc6f5dc93baa
Apr 17 14:40:29.681934 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:40:29.681915 2582 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 14:40:29.959417 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:40:29.959328 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-z82rb" event={"ID":"1c999004-4a51-40aa-b675-8213daef0914","Type":"ContainerStarted","Data":"19c2d008840476c4034045f20bf6a65660ed388131259900f669cc6f5dc93baa"}
Apr 17 14:40:34.975204 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:40:34.975159 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-z82rb" event={"ID":"1c999004-4a51-40aa-b675-8213daef0914","Type":"ContainerStarted","Data":"3a21ea8204861fd51adc3172b6b4f16e731902b7b2dc09ca8f38af2c089d3331"}
Apr 17 14:40:34.989181 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:40:34.989128 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-z82rb" podStartSLOduration=1.304115181 podStartE2EDuration="5.98910972s" podCreationTimestamp="2026-04-17 14:40:29 +0000 UTC" firstStartedPulling="2026-04-17 14:40:29.682091379 +0000 UTC m=+382.549477009" lastFinishedPulling="2026-04-17 14:40:34.367085919 +0000 UTC m=+387.234471548" observedRunningTime="2026-04-17 14:40:34.988839255 +0000 UTC m=+387.856224907" watchObservedRunningTime="2026-04-17 14:40:34.98910972 +0000 UTC m=+387.856495371"
Apr 17 14:40:56.251892 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:40:56.251850 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kxkr6"]
Apr 17 14:40:56.255322 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:40:56.255304 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kxkr6"
Apr 17 14:40:56.257634 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:40:56.257610 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 17 14:40:56.258556 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:40:56.258531 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-6dr9g\""
Apr 17 14:40:56.258635 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:40:56.258533 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 17 14:40:56.262977 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:40:56.262950 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kxkr6"]
Apr 17 14:40:56.374113 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:40:56.374073 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fdd23d42-510e-49b9-8244-68d9f3a9b4ab-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kxkr6\" (UID: \"fdd23d42-510e-49b9-8244-68d9f3a9b4ab\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kxkr6"
Apr 17 14:40:56.374292 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:40:56.374124 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tls6h\" (UniqueName: \"kubernetes.io/projected/fdd23d42-510e-49b9-8244-68d9f3a9b4ab-kube-api-access-tls6h\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kxkr6\" (UID: \"fdd23d42-510e-49b9-8244-68d9f3a9b4ab\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kxkr6"
Apr 17 14:40:56.374292 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:40:56.374192 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fdd23d42-510e-49b9-8244-68d9f3a9b4ab-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kxkr6\" (UID: \"fdd23d42-510e-49b9-8244-68d9f3a9b4ab\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kxkr6"
Apr 17 14:40:56.475576 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:40:56.475541 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fdd23d42-510e-49b9-8244-68d9f3a9b4ab-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kxkr6\" (UID: \"fdd23d42-510e-49b9-8244-68d9f3a9b4ab\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kxkr6"
Apr 17 14:40:56.475736 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:40:56.475584 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tls6h\" (UniqueName: \"kubernetes.io/projected/fdd23d42-510e-49b9-8244-68d9f3a9b4ab-kube-api-access-tls6h\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kxkr6\" (UID: \"fdd23d42-510e-49b9-8244-68d9f3a9b4ab\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kxkr6"
Apr 17 14:40:56.475736 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:40:56.475609 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fdd23d42-510e-49b9-8244-68d9f3a9b4ab-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kxkr6\" (UID: \"fdd23d42-510e-49b9-8244-68d9f3a9b4ab\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kxkr6"
Apr 17 14:40:56.475950 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:40:56.475929 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fdd23d42-510e-49b9-8244-68d9f3a9b4ab-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kxkr6\" (UID: \"fdd23d42-510e-49b9-8244-68d9f3a9b4ab\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kxkr6"
Apr 17 14:40:56.475996 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:40:56.475981 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fdd23d42-510e-49b9-8244-68d9f3a9b4ab-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kxkr6\" (UID: \"fdd23d42-510e-49b9-8244-68d9f3a9b4ab\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kxkr6"
Apr 17 14:40:56.484057 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:40:56.484027 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tls6h\" (UniqueName: \"kubernetes.io/projected/fdd23d42-510e-49b9-8244-68d9f3a9b4ab-kube-api-access-tls6h\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kxkr6\" (UID: \"fdd23d42-510e-49b9-8244-68d9f3a9b4ab\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kxkr6"
Apr 17 14:40:56.565075 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:40:56.564992 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kxkr6"
Apr 17 14:40:56.689372 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:40:56.689074 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kxkr6"]
Apr 17 14:40:56.692355 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:40:56.692204 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfdd23d42_510e_49b9_8244_68d9f3a9b4ab.slice/crio-6b509e71f40c35cdcd0e7d064da69c2a772b07077817a331559011a9662da2ec WatchSource:0}: Error finding container 6b509e71f40c35cdcd0e7d064da69c2a772b07077817a331559011a9662da2ec: Status 404 returned error can't find the container with id 6b509e71f40c35cdcd0e7d064da69c2a772b07077817a331559011a9662da2ec
Apr 17 14:40:57.034702 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:40:57.034666 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kxkr6" event={"ID":"fdd23d42-510e-49b9-8244-68d9f3a9b4ab","Type":"ContainerStarted","Data":"6b509e71f40c35cdcd0e7d064da69c2a772b07077817a331559011a9662da2ec"}
Apr 17 14:41:03.054046 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:03.054006 2582 generic.go:358] "Generic (PLEG): container finished" podID="fdd23d42-510e-49b9-8244-68d9f3a9b4ab" containerID="32eb4d7e50fbe8732b6c9512ed1a661beefb26e5135b7f71244019ada035eda5" exitCode=0
Apr 17 14:41:03.054494 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:03.054055 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kxkr6" event={"ID":"fdd23d42-510e-49b9-8244-68d9f3a9b4ab","Type":"ContainerDied","Data":"32eb4d7e50fbe8732b6c9512ed1a661beefb26e5135b7f71244019ada035eda5"}
Apr 17 14:41:06.066134 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:06.066093 2582 generic.go:358] "Generic (PLEG): container finished" podID="fdd23d42-510e-49b9-8244-68d9f3a9b4ab" containerID="2e75282a54acb7cb02ff5d2b4a411fb481426890a154dfaf4af60a903ffe02e1" exitCode=0
Apr 17 14:41:06.066517 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:06.066153 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kxkr6" event={"ID":"fdd23d42-510e-49b9-8244-68d9f3a9b4ab","Type":"ContainerDied","Data":"2e75282a54acb7cb02ff5d2b4a411fb481426890a154dfaf4af60a903ffe02e1"}
Apr 17 14:41:13.089168 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:13.089129 2582 generic.go:358] "Generic (PLEG): container finished" podID="fdd23d42-510e-49b9-8244-68d9f3a9b4ab" containerID="70981a8fdb6fd13bd0df4be9e2c5a1ee0888b249c77d2a46f96222cd8fbddfb1" exitCode=0
Apr 17 14:41:13.089168 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:13.089172 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kxkr6" event={"ID":"fdd23d42-510e-49b9-8244-68d9f3a9b4ab","Type":"ContainerDied","Data":"70981a8fdb6fd13bd0df4be9e2c5a1ee0888b249c77d2a46f96222cd8fbddfb1"}
Apr 17 14:41:14.210133 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:14.210110 2582 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kxkr6"
Apr 17 14:41:14.328091 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:14.328052 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fdd23d42-510e-49b9-8244-68d9f3a9b4ab-util\") pod \"fdd23d42-510e-49b9-8244-68d9f3a9b4ab\" (UID: \"fdd23d42-510e-49b9-8244-68d9f3a9b4ab\") "
Apr 17 14:41:14.328091 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:14.328093 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tls6h\" (UniqueName: \"kubernetes.io/projected/fdd23d42-510e-49b9-8244-68d9f3a9b4ab-kube-api-access-tls6h\") pod \"fdd23d42-510e-49b9-8244-68d9f3a9b4ab\" (UID: \"fdd23d42-510e-49b9-8244-68d9f3a9b4ab\") "
Apr 17 14:41:14.328331 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:14.328114 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fdd23d42-510e-49b9-8244-68d9f3a9b4ab-bundle\") pod \"fdd23d42-510e-49b9-8244-68d9f3a9b4ab\" (UID: \"fdd23d42-510e-49b9-8244-68d9f3a9b4ab\") "
Apr 17 14:41:14.328756 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:14.328731 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fdd23d42-510e-49b9-8244-68d9f3a9b4ab-bundle" (OuterVolumeSpecName: "bundle") pod "fdd23d42-510e-49b9-8244-68d9f3a9b4ab" (UID: "fdd23d42-510e-49b9-8244-68d9f3a9b4ab"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 14:41:14.330384 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:14.330355 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fdd23d42-510e-49b9-8244-68d9f3a9b4ab-kube-api-access-tls6h" (OuterVolumeSpecName: "kube-api-access-tls6h") pod "fdd23d42-510e-49b9-8244-68d9f3a9b4ab" (UID: "fdd23d42-510e-49b9-8244-68d9f3a9b4ab"). InnerVolumeSpecName "kube-api-access-tls6h". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 14:41:14.332913 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:14.332878 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fdd23d42-510e-49b9-8244-68d9f3a9b4ab-util" (OuterVolumeSpecName: "util") pod "fdd23d42-510e-49b9-8244-68d9f3a9b4ab" (UID: "fdd23d42-510e-49b9-8244-68d9f3a9b4ab"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 14:41:14.428710 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:14.428605 2582 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fdd23d42-510e-49b9-8244-68d9f3a9b4ab-util\") on node \"ip-10-0-143-171.ec2.internal\" DevicePath \"\""
Apr 17 14:41:14.428710 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:14.428649 2582 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tls6h\" (UniqueName: \"kubernetes.io/projected/fdd23d42-510e-49b9-8244-68d9f3a9b4ab-kube-api-access-tls6h\") on node \"ip-10-0-143-171.ec2.internal\" DevicePath \"\""
Apr 17 14:41:14.428710 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:14.428666 2582 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fdd23d42-510e-49b9-8244-68d9f3a9b4ab-bundle\") on node \"ip-10-0-143-171.ec2.internal\" DevicePath \"\""
Apr 17 14:41:15.097533 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:15.097496 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kxkr6" event={"ID":"fdd23d42-510e-49b9-8244-68d9f3a9b4ab","Type":"ContainerDied","Data":"6b509e71f40c35cdcd0e7d064da69c2a772b07077817a331559011a9662da2ec"}
Apr 17 14:41:15.097533 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:15.097538 2582 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b509e71f40c35cdcd0e7d064da69c2a772b07077817a331559011a9662da2ec"
Apr 17 14:41:15.097733 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:15.097515 2582 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kxkr6"
Apr 17 14:41:24.431125 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:24.431031 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftxmfv"]
Apr 17 14:41:24.431515 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:24.431326 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fdd23d42-510e-49b9-8244-68d9f3a9b4ab" containerName="extract"
Apr 17 14:41:24.431515 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:24.431337 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdd23d42-510e-49b9-8244-68d9f3a9b4ab" containerName="extract"
Apr 17 14:41:24.431515 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:24.431356 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fdd23d42-510e-49b9-8244-68d9f3a9b4ab" containerName="pull"
Apr 17 14:41:24.431515 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:24.431361 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdd23d42-510e-49b9-8244-68d9f3a9b4ab" containerName="pull"
Apr 17 14:41:24.431515 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:24.431375 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fdd23d42-510e-49b9-8244-68d9f3a9b4ab" containerName="util"
Apr 17 14:41:24.431515 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:24.431380 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdd23d42-510e-49b9-8244-68d9f3a9b4ab" containerName="util"
Apr 17 14:41:24.431515 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:24.431426 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="fdd23d42-510e-49b9-8244-68d9f3a9b4ab" containerName="extract"
Apr 17 14:41:24.435840 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:24.435816 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftxmfv"
Apr 17 14:41:24.437996 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:24.437970 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 17 14:41:24.438125 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:24.438013 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-6dr9g\""
Apr 17 14:41:24.438902 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:24.438881 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 17 14:41:24.444128 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:24.444100 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftxmfv"]
Apr 17 14:41:24.610288 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:24.610246 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2603b490-5f1c-4381-aaba-62202f2dcd6a-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftxmfv\" (UID: \"2603b490-5f1c-4381-aaba-62202f2dcd6a\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftxmfv"
Apr 17 14:41:24.610288 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:24.610290 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2603b490-5f1c-4381-aaba-62202f2dcd6a-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftxmfv\" (UID: \"2603b490-5f1c-4381-aaba-62202f2dcd6a\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftxmfv"
Apr 17 14:41:24.610509 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:24.610312 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnsjj\" (UniqueName: \"kubernetes.io/projected/2603b490-5f1c-4381-aaba-62202f2dcd6a-kube-api-access-fnsjj\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftxmfv\" (UID: \"2603b490-5f1c-4381-aaba-62202f2dcd6a\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftxmfv"
Apr 17 14:41:24.711186 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:24.711069 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2603b490-5f1c-4381-aaba-62202f2dcd6a-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftxmfv\" (UID: \"2603b490-5f1c-4381-aaba-62202f2dcd6a\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftxmfv"
Apr 17 14:41:24.711186 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:24.711126 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2603b490-5f1c-4381-aaba-62202f2dcd6a-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftxmfv\" (UID: \"2603b490-5f1c-4381-aaba-62202f2dcd6a\") "
pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftxmfv" Apr 17 14:41:24.711186 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:24.711147 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fnsjj\" (UniqueName: \"kubernetes.io/projected/2603b490-5f1c-4381-aaba-62202f2dcd6a-kube-api-access-fnsjj\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftxmfv\" (UID: \"2603b490-5f1c-4381-aaba-62202f2dcd6a\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftxmfv" Apr 17 14:41:24.711481 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:24.711459 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2603b490-5f1c-4381-aaba-62202f2dcd6a-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftxmfv\" (UID: \"2603b490-5f1c-4381-aaba-62202f2dcd6a\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftxmfv" Apr 17 14:41:24.711595 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:24.711576 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2603b490-5f1c-4381-aaba-62202f2dcd6a-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftxmfv\" (UID: \"2603b490-5f1c-4381-aaba-62202f2dcd6a\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftxmfv" Apr 17 14:41:24.719374 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:24.719340 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnsjj\" (UniqueName: \"kubernetes.io/projected/2603b490-5f1c-4381-aaba-62202f2dcd6a-kube-api-access-fnsjj\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftxmfv\" (UID: \"2603b490-5f1c-4381-aaba-62202f2dcd6a\") " 
pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftxmfv" Apr 17 14:41:24.745503 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:24.745474 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftxmfv" Apr 17 14:41:24.866460 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:24.866433 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftxmfv"] Apr 17 14:41:24.869139 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:41:24.869106 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2603b490_5f1c_4381_aaba_62202f2dcd6a.slice/crio-b2330ba967e6ba114129afcc613f7c81f571f1c6ef029b19b66a95c4f047a7d0 WatchSource:0}: Error finding container b2330ba967e6ba114129afcc613f7c81f571f1c6ef029b19b66a95c4f047a7d0: Status 404 returned error can't find the container with id b2330ba967e6ba114129afcc613f7c81f571f1c6ef029b19b66a95c4f047a7d0 Apr 17 14:41:25.111849 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:25.111786 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-v588c"] Apr 17 14:41:25.115000 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:25.114977 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-v588c" Apr 17 14:41:25.117234 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:25.117210 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 17 14:41:25.117337 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:25.117296 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-hstbs\"" Apr 17 14:41:25.117337 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:25.117297 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 17 14:41:25.121756 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:25.121729 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-v588c"] Apr 17 14:41:25.128242 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:25.128211 2582 generic.go:358] "Generic (PLEG): container finished" podID="2603b490-5f1c-4381-aaba-62202f2dcd6a" containerID="1a8175db6b5be30cc69c369bad2ef7c14e74fd92ec68bcc5e31bd66b03c6d099" exitCode=0 Apr 17 14:41:25.128355 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:25.128289 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftxmfv" event={"ID":"2603b490-5f1c-4381-aaba-62202f2dcd6a","Type":"ContainerDied","Data":"1a8175db6b5be30cc69c369bad2ef7c14e74fd92ec68bcc5e31bd66b03c6d099"} Apr 17 14:41:25.128355 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:25.128321 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftxmfv" event={"ID":"2603b490-5f1c-4381-aaba-62202f2dcd6a","Type":"ContainerStarted","Data":"b2330ba967e6ba114129afcc613f7c81f571f1c6ef029b19b66a95c4f047a7d0"} Apr 17 14:41:25.215178 ip-10-0-143-171 
kubenswrapper[2582]: I0417 14:41:25.215140 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dk7pm\" (UniqueName: \"kubernetes.io/projected/725cae0c-bca3-4ac9-856d-386a3d32383d-kube-api-access-dk7pm\") pod \"cert-manager-webhook-597b96b99b-v588c\" (UID: \"725cae0c-bca3-4ac9-856d-386a3d32383d\") " pod="cert-manager/cert-manager-webhook-597b96b99b-v588c" Apr 17 14:41:25.215349 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:25.215189 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/725cae0c-bca3-4ac9-856d-386a3d32383d-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-v588c\" (UID: \"725cae0c-bca3-4ac9-856d-386a3d32383d\") " pod="cert-manager/cert-manager-webhook-597b96b99b-v588c" Apr 17 14:41:25.315931 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:25.315892 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dk7pm\" (UniqueName: \"kubernetes.io/projected/725cae0c-bca3-4ac9-856d-386a3d32383d-kube-api-access-dk7pm\") pod \"cert-manager-webhook-597b96b99b-v588c\" (UID: \"725cae0c-bca3-4ac9-856d-386a3d32383d\") " pod="cert-manager/cert-manager-webhook-597b96b99b-v588c" Apr 17 14:41:25.316094 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:25.315937 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/725cae0c-bca3-4ac9-856d-386a3d32383d-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-v588c\" (UID: \"725cae0c-bca3-4ac9-856d-386a3d32383d\") " pod="cert-manager/cert-manager-webhook-597b96b99b-v588c" Apr 17 14:41:25.324192 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:25.324163 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/725cae0c-bca3-4ac9-856d-386a3d32383d-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-v588c\" (UID: \"725cae0c-bca3-4ac9-856d-386a3d32383d\") " pod="cert-manager/cert-manager-webhook-597b96b99b-v588c" Apr 17 14:41:25.324320 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:25.324302 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dk7pm\" (UniqueName: \"kubernetes.io/projected/725cae0c-bca3-4ac9-856d-386a3d32383d-kube-api-access-dk7pm\") pod \"cert-manager-webhook-597b96b99b-v588c\" (UID: \"725cae0c-bca3-4ac9-856d-386a3d32383d\") " pod="cert-manager/cert-manager-webhook-597b96b99b-v588c" Apr 17 14:41:25.434113 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:25.434078 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-v588c" Apr 17 14:41:25.551942 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:25.551913 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-v588c"] Apr 17 14:41:25.554316 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:41:25.554286 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod725cae0c_bca3_4ac9_856d_386a3d32383d.slice/crio-7b4eb513ca0c33b5ca75e9262586e7fe1068927d283a8791c277d4af13d7b19f WatchSource:0}: Error finding container 7b4eb513ca0c33b5ca75e9262586e7fe1068927d283a8791c277d4af13d7b19f: Status 404 returned error can't find the container with id 7b4eb513ca0c33b5ca75e9262586e7fe1068927d283a8791c277d4af13d7b19f Apr 17 14:41:26.132784 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:26.132747 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-v588c" event={"ID":"725cae0c-bca3-4ac9-856d-386a3d32383d","Type":"ContainerStarted","Data":"7b4eb513ca0c33b5ca75e9262586e7fe1068927d283a8791c277d4af13d7b19f"} Apr 17 
14:41:28.141947 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:28.141906 2582 generic.go:358] "Generic (PLEG): container finished" podID="2603b490-5f1c-4381-aaba-62202f2dcd6a" containerID="1da0333f6bee471cbda6df81a56f62c0ef7612dfc9f5902ab3294b1021b4f3fa" exitCode=0 Apr 17 14:41:28.142409 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:28.141987 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftxmfv" event={"ID":"2603b490-5f1c-4381-aaba-62202f2dcd6a","Type":"ContainerDied","Data":"1da0333f6bee471cbda6df81a56f62c0ef7612dfc9f5902ab3294b1021b4f3fa"} Apr 17 14:41:29.146568 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:29.146528 2582 generic.go:358] "Generic (PLEG): container finished" podID="2603b490-5f1c-4381-aaba-62202f2dcd6a" containerID="f6c83d8720113cd8fc40fa9da83cfae27adf9942e7e7e3977893784a28319d36" exitCode=0 Apr 17 14:41:29.147048 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:29.146592 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftxmfv" event={"ID":"2603b490-5f1c-4381-aaba-62202f2dcd6a","Type":"ContainerDied","Data":"f6c83d8720113cd8fc40fa9da83cfae27adf9942e7e7e3977893784a28319d36"} Apr 17 14:41:29.148008 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:29.147982 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-v588c" event={"ID":"725cae0c-bca3-4ac9-856d-386a3d32383d","Type":"ContainerStarted","Data":"e31b80298620946f1d7457f71559c7f48de1036ac37d824ba93a87fe095dadba"} Apr 17 14:41:29.148127 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:29.148108 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-597b96b99b-v588c" Apr 17 14:41:29.183111 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:29.183057 2582 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="cert-manager/cert-manager-webhook-597b96b99b-v588c" podStartSLOduration=1.100107144 podStartE2EDuration="4.183042416s" podCreationTimestamp="2026-04-17 14:41:25 +0000 UTC" firstStartedPulling="2026-04-17 14:41:25.556121666 +0000 UTC m=+438.423507295" lastFinishedPulling="2026-04-17 14:41:28.639056936 +0000 UTC m=+441.506442567" observedRunningTime="2026-04-17 14:41:29.181280651 +0000 UTC m=+442.048666305" watchObservedRunningTime="2026-04-17 14:41:29.183042416 +0000 UTC m=+442.050428066" Apr 17 14:41:30.276835 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:30.276791 2582 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftxmfv" Apr 17 14:41:30.455617 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:30.455525 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2603b490-5f1c-4381-aaba-62202f2dcd6a-bundle\") pod \"2603b490-5f1c-4381-aaba-62202f2dcd6a\" (UID: \"2603b490-5f1c-4381-aaba-62202f2dcd6a\") " Apr 17 14:41:30.455617 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:30.455583 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fnsjj\" (UniqueName: \"kubernetes.io/projected/2603b490-5f1c-4381-aaba-62202f2dcd6a-kube-api-access-fnsjj\") pod \"2603b490-5f1c-4381-aaba-62202f2dcd6a\" (UID: \"2603b490-5f1c-4381-aaba-62202f2dcd6a\") " Apr 17 14:41:30.455872 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:30.455628 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2603b490-5f1c-4381-aaba-62202f2dcd6a-util\") pod \"2603b490-5f1c-4381-aaba-62202f2dcd6a\" (UID: \"2603b490-5f1c-4381-aaba-62202f2dcd6a\") " Apr 17 14:41:30.456060 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:30.456033 2582 operation_generator.go:781] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2603b490-5f1c-4381-aaba-62202f2dcd6a-bundle" (OuterVolumeSpecName: "bundle") pod "2603b490-5f1c-4381-aaba-62202f2dcd6a" (UID: "2603b490-5f1c-4381-aaba-62202f2dcd6a"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 14:41:30.457847 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:30.457825 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2603b490-5f1c-4381-aaba-62202f2dcd6a-kube-api-access-fnsjj" (OuterVolumeSpecName: "kube-api-access-fnsjj") pod "2603b490-5f1c-4381-aaba-62202f2dcd6a" (UID: "2603b490-5f1c-4381-aaba-62202f2dcd6a"). InnerVolumeSpecName "kube-api-access-fnsjj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 14:41:30.491663 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:30.491626 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-sgmr6"] Apr 17 14:41:30.491971 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:30.491956 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2603b490-5f1c-4381-aaba-62202f2dcd6a" containerName="extract" Apr 17 14:41:30.492035 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:30.491972 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="2603b490-5f1c-4381-aaba-62202f2dcd6a" containerName="extract" Apr 17 14:41:30.492035 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:30.491995 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2603b490-5f1c-4381-aaba-62202f2dcd6a" containerName="util" Apr 17 14:41:30.492035 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:30.492001 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="2603b490-5f1c-4381-aaba-62202f2dcd6a" containerName="util" Apr 17 14:41:30.492035 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:30.492010 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: 
removing container" podUID="2603b490-5f1c-4381-aaba-62202f2dcd6a" containerName="pull" Apr 17 14:41:30.492035 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:30.492015 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="2603b490-5f1c-4381-aaba-62202f2dcd6a" containerName="pull" Apr 17 14:41:30.492177 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:30.492063 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="2603b490-5f1c-4381-aaba-62202f2dcd6a" containerName="extract" Apr 17 14:41:30.505756 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:30.505718 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-sgmr6"] Apr 17 14:41:30.505756 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:30.505749 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-sgmr6" Apr 17 14:41:30.508322 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:30.508301 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-vtvpx\"" Apr 17 14:41:30.557074 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:30.557040 2582 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fnsjj\" (UniqueName: \"kubernetes.io/projected/2603b490-5f1c-4381-aaba-62202f2dcd6a-kube-api-access-fnsjj\") on node \"ip-10-0-143-171.ec2.internal\" DevicePath \"\"" Apr 17 14:41:30.557074 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:30.557066 2582 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2603b490-5f1c-4381-aaba-62202f2dcd6a-bundle\") on node \"ip-10-0-143-171.ec2.internal\" DevicePath \"\"" Apr 17 14:41:30.592727 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:30.592686 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2603b490-5f1c-4381-aaba-62202f2dcd6a-util" 
(OuterVolumeSpecName: "util") pod "2603b490-5f1c-4381-aaba-62202f2dcd6a" (UID: "2603b490-5f1c-4381-aaba-62202f2dcd6a"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 14:41:30.657481 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:30.657444 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5ffc39f8-50a0-4cac-81dc-24fac85ad306-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-sgmr6\" (UID: \"5ffc39f8-50a0-4cac-81dc-24fac85ad306\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-sgmr6" Apr 17 14:41:30.657640 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:30.657506 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfksf\" (UniqueName: \"kubernetes.io/projected/5ffc39f8-50a0-4cac-81dc-24fac85ad306-kube-api-access-bfksf\") pod \"cert-manager-cainjector-8966b78d4-sgmr6\" (UID: \"5ffc39f8-50a0-4cac-81dc-24fac85ad306\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-sgmr6" Apr 17 14:41:30.657640 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:30.657560 2582 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2603b490-5f1c-4381-aaba-62202f2dcd6a-util\") on node \"ip-10-0-143-171.ec2.internal\" DevicePath \"\"" Apr 17 14:41:30.758551 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:30.758519 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bfksf\" (UniqueName: \"kubernetes.io/projected/5ffc39f8-50a0-4cac-81dc-24fac85ad306-kube-api-access-bfksf\") pod \"cert-manager-cainjector-8966b78d4-sgmr6\" (UID: \"5ffc39f8-50a0-4cac-81dc-24fac85ad306\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-sgmr6" Apr 17 14:41:30.758700 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:30.758593 2582 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5ffc39f8-50a0-4cac-81dc-24fac85ad306-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-sgmr6\" (UID: \"5ffc39f8-50a0-4cac-81dc-24fac85ad306\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-sgmr6" Apr 17 14:41:30.766194 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:30.766170 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5ffc39f8-50a0-4cac-81dc-24fac85ad306-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-sgmr6\" (UID: \"5ffc39f8-50a0-4cac-81dc-24fac85ad306\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-sgmr6" Apr 17 14:41:30.766296 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:30.766280 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfksf\" (UniqueName: \"kubernetes.io/projected/5ffc39f8-50a0-4cac-81dc-24fac85ad306-kube-api-access-bfksf\") pod \"cert-manager-cainjector-8966b78d4-sgmr6\" (UID: \"5ffc39f8-50a0-4cac-81dc-24fac85ad306\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-sgmr6" Apr 17 14:41:30.815277 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:30.815246 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-sgmr6" Apr 17 14:41:30.934146 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:30.934109 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-sgmr6"] Apr 17 14:41:30.937008 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:41:30.936977 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5ffc39f8_50a0_4cac_81dc_24fac85ad306.slice/crio-941577bbaf068dda1d4eaa9202e937b5bb60f7785d124c5bccb81a588be2bc26 WatchSource:0}: Error finding container 941577bbaf068dda1d4eaa9202e937b5bb60f7785d124c5bccb81a588be2bc26: Status 404 returned error can't find the container with id 941577bbaf068dda1d4eaa9202e937b5bb60f7785d124c5bccb81a588be2bc26 Apr 17 14:41:31.155857 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:31.155707 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-sgmr6" event={"ID":"5ffc39f8-50a0-4cac-81dc-24fac85ad306","Type":"ContainerStarted","Data":"2565df479ce264a8325ef75a00bc5cb8cc7a0bc515311120eed7c8245702a300"} Apr 17 14:41:31.155857 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:31.155769 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-sgmr6" event={"ID":"5ffc39f8-50a0-4cac-81dc-24fac85ad306","Type":"ContainerStarted","Data":"941577bbaf068dda1d4eaa9202e937b5bb60f7785d124c5bccb81a588be2bc26"} Apr 17 14:41:31.157437 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:31.157397 2582 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftxmfv" Apr 17 14:41:31.157559 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:31.157413 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftxmfv" event={"ID":"2603b490-5f1c-4381-aaba-62202f2dcd6a","Type":"ContainerDied","Data":"b2330ba967e6ba114129afcc613f7c81f571f1c6ef029b19b66a95c4f047a7d0"} Apr 17 14:41:31.157559 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:31.157540 2582 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b2330ba967e6ba114129afcc613f7c81f571f1c6ef029b19b66a95c4f047a7d0" Apr 17 14:41:31.171611 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:31.171555 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-8966b78d4-sgmr6" podStartSLOduration=1.171539473 podStartE2EDuration="1.171539473s" podCreationTimestamp="2026-04-17 14:41:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 14:41:31.169933061 +0000 UTC m=+444.037318714" watchObservedRunningTime="2026-04-17 14:41:31.171539473 +0000 UTC m=+444.038925125" Apr 17 14:41:35.153785 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:35.153750 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-597b96b99b-v588c" Apr 17 14:41:37.220374 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:37.220335 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-c46sl"] Apr 17 14:41:37.227058 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:37.227031 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-c46sl" Apr 17 14:41:37.229605 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:37.229579 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 17 14:41:37.230015 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:37.229985 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-c46sl"] Apr 17 14:41:37.230664 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:37.230645 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 17 14:41:37.230729 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:37.230645 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"openshift-lws-operator-dockercfg-mp7h8\"" Apr 17 14:41:37.314859 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:37.314784 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1e2a754c-4d22-4c25-b310-d76568637da3-tmp\") pod \"openshift-lws-operator-bfc7f696d-c46sl\" (UID: \"1e2a754c-4d22-4c25-b310-d76568637da3\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-c46sl" Apr 17 14:41:37.314859 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:37.314851 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l55qr\" (UniqueName: \"kubernetes.io/projected/1e2a754c-4d22-4c25-b310-d76568637da3-kube-api-access-l55qr\") pod \"openshift-lws-operator-bfc7f696d-c46sl\" (UID: \"1e2a754c-4d22-4c25-b310-d76568637da3\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-c46sl" Apr 17 14:41:37.415389 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:37.415354 2582 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1e2a754c-4d22-4c25-b310-d76568637da3-tmp\") pod \"openshift-lws-operator-bfc7f696d-c46sl\" (UID: \"1e2a754c-4d22-4c25-b310-d76568637da3\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-c46sl" Apr 17 14:41:37.415389 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:37.415390 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l55qr\" (UniqueName: \"kubernetes.io/projected/1e2a754c-4d22-4c25-b310-d76568637da3-kube-api-access-l55qr\") pod \"openshift-lws-operator-bfc7f696d-c46sl\" (UID: \"1e2a754c-4d22-4c25-b310-d76568637da3\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-c46sl" Apr 17 14:41:37.415697 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:37.415678 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1e2a754c-4d22-4c25-b310-d76568637da3-tmp\") pod \"openshift-lws-operator-bfc7f696d-c46sl\" (UID: \"1e2a754c-4d22-4c25-b310-d76568637da3\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-c46sl" Apr 17 14:41:37.423681 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:37.423649 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l55qr\" (UniqueName: \"kubernetes.io/projected/1e2a754c-4d22-4c25-b310-d76568637da3-kube-api-access-l55qr\") pod \"openshift-lws-operator-bfc7f696d-c46sl\" (UID: \"1e2a754c-4d22-4c25-b310-d76568637da3\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-c46sl" Apr 17 14:41:37.537741 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:37.537704 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-c46sl"
Apr 17 14:41:37.660065 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:37.659856 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-c46sl"]
Apr 17 14:41:38.180906 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:38.180868 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-c46sl" event={"ID":"1e2a754c-4d22-4c25-b310-d76568637da3","Type":"ContainerStarted","Data":"eefd14ee5d6be18118b0e0f9ffdf524010263474d77e4124a639b81df9d7e756"}
Apr 17 14:41:40.189584 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:40.189544 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-c46sl" event={"ID":"1e2a754c-4d22-4c25-b310-d76568637da3","Type":"ContainerStarted","Data":"a6be2f2d32753e1406df5f7e6958fa8914833e2182bfb0aa81c23c1ecb61be9f"}
Apr 17 14:41:40.206141 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:40.206081 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-c46sl" podStartSLOduration=1.381129636 podStartE2EDuration="3.206062042s" podCreationTimestamp="2026-04-17 14:41:37 +0000 UTC" firstStartedPulling="2026-04-17 14:41:37.664625739 +0000 UTC m=+450.532011371" lastFinishedPulling="2026-04-17 14:41:39.489558148 +0000 UTC m=+452.356943777" observedRunningTime="2026-04-17 14:41:40.204457421 +0000 UTC m=+453.071843072" watchObservedRunningTime="2026-04-17 14:41:40.206062042 +0000 UTC m=+453.073447693"
Apr 17 14:41:43.237825 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:43.234848 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-759f64656b-mljhl"]
Apr 17 14:41:43.256341 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:43.256310 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-mljhl"]
Apr 17 14:41:43.256490 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:43.256415 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-759f64656b-mljhl"
Apr 17 14:41:43.257700 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:43.257681 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/77caa693-3579-44a8-b25a-5f7b54085ba6-bound-sa-token\") pod \"cert-manager-759f64656b-mljhl\" (UID: \"77caa693-3579-44a8-b25a-5f7b54085ba6\") " pod="cert-manager/cert-manager-759f64656b-mljhl"
Apr 17 14:41:43.257746 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:43.257718 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6657v\" (UniqueName: \"kubernetes.io/projected/77caa693-3579-44a8-b25a-5f7b54085ba6-kube-api-access-6657v\") pod \"cert-manager-759f64656b-mljhl\" (UID: \"77caa693-3579-44a8-b25a-5f7b54085ba6\") " pod="cert-manager/cert-manager-759f64656b-mljhl"
Apr 17 14:41:43.258750 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:43.258728 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-fwv7d\""
Apr 17 14:41:43.358582 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:43.358537 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/77caa693-3579-44a8-b25a-5f7b54085ba6-bound-sa-token\") pod \"cert-manager-759f64656b-mljhl\" (UID: \"77caa693-3579-44a8-b25a-5f7b54085ba6\") " pod="cert-manager/cert-manager-759f64656b-mljhl"
Apr 17 14:41:43.358582 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:43.358590 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6657v\" (UniqueName: \"kubernetes.io/projected/77caa693-3579-44a8-b25a-5f7b54085ba6-kube-api-access-6657v\") pod \"cert-manager-759f64656b-mljhl\" (UID: \"77caa693-3579-44a8-b25a-5f7b54085ba6\") " pod="cert-manager/cert-manager-759f64656b-mljhl"
Apr 17 14:41:43.366287 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:43.366255 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/77caa693-3579-44a8-b25a-5f7b54085ba6-bound-sa-token\") pod \"cert-manager-759f64656b-mljhl\" (UID: \"77caa693-3579-44a8-b25a-5f7b54085ba6\") " pod="cert-manager/cert-manager-759f64656b-mljhl"
Apr 17 14:41:43.366462 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:43.366440 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6657v\" (UniqueName: \"kubernetes.io/projected/77caa693-3579-44a8-b25a-5f7b54085ba6-kube-api-access-6657v\") pod \"cert-manager-759f64656b-mljhl\" (UID: \"77caa693-3579-44a8-b25a-5f7b54085ba6\") " pod="cert-manager/cert-manager-759f64656b-mljhl"
Apr 17 14:41:43.565011 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:43.564961 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-759f64656b-mljhl"
Apr 17 14:41:43.694202 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:43.694176 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-mljhl"]
Apr 17 14:41:43.696322 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:41:43.696291 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod77caa693_3579_44a8_b25a_5f7b54085ba6.slice/crio-dc10b0806e9516c655fc290bb5db47fe28aa2f36c1f63f23965e74d14ee57d5b WatchSource:0}: Error finding container dc10b0806e9516c655fc290bb5db47fe28aa2f36c1f63f23965e74d14ee57d5b: Status 404 returned error can't find the container with id dc10b0806e9516c655fc290bb5db47fe28aa2f36c1f63f23965e74d14ee57d5b
Apr 17 14:41:44.204467 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:44.204427 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-mljhl" event={"ID":"77caa693-3579-44a8-b25a-5f7b54085ba6","Type":"ContainerStarted","Data":"24efdc961029c896660e0c938da170c57dbf0e624f54390163c863e63cc98d81"}
Apr 17 14:41:44.204467 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:44.204472 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-mljhl" event={"ID":"77caa693-3579-44a8-b25a-5f7b54085ba6","Type":"ContainerStarted","Data":"dc10b0806e9516c655fc290bb5db47fe28aa2f36c1f63f23965e74d14ee57d5b"}
Apr 17 14:41:44.238702 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:44.238651 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-759f64656b-mljhl" podStartSLOduration=1.2386367090000001 podStartE2EDuration="1.238636709s" podCreationTimestamp="2026-04-17 14:41:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 14:41:44.238131155 +0000 UTC m=+457.105516805" watchObservedRunningTime="2026-04-17 14:41:44.238636709 +0000 UTC m=+457.106022361"
Apr 17 14:41:44.595884 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:44.595846 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5r7ncd"]
Apr 17 14:41:44.621703 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:44.621673 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5r7ncd"]
Apr 17 14:41:44.621872 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:44.621824 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5r7ncd"
Apr 17 14:41:44.624911 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:44.624888 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 17 14:41:44.625534 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:44.625520 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 17 14:41:44.625587 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:44.625520 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-6dr9g\""
Apr 17 14:41:44.669395 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:44.669355 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b5a16ea8-96e4-4646-ae0e-38e7f000dd39-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5r7ncd\" (UID: \"b5a16ea8-96e4-4646-ae0e-38e7f000dd39\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5r7ncd"
Apr 17 14:41:44.669556 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:44.669464 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vhhq\" (UniqueName: \"kubernetes.io/projected/b5a16ea8-96e4-4646-ae0e-38e7f000dd39-kube-api-access-2vhhq\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5r7ncd\" (UID: \"b5a16ea8-96e4-4646-ae0e-38e7f000dd39\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5r7ncd"
Apr 17 14:41:44.669556 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:44.669519 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b5a16ea8-96e4-4646-ae0e-38e7f000dd39-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5r7ncd\" (UID: \"b5a16ea8-96e4-4646-ae0e-38e7f000dd39\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5r7ncd"
Apr 17 14:41:44.770635 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:44.770594 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2vhhq\" (UniqueName: \"kubernetes.io/projected/b5a16ea8-96e4-4646-ae0e-38e7f000dd39-kube-api-access-2vhhq\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5r7ncd\" (UID: \"b5a16ea8-96e4-4646-ae0e-38e7f000dd39\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5r7ncd"
Apr 17 14:41:44.770882 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:44.770680 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b5a16ea8-96e4-4646-ae0e-38e7f000dd39-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5r7ncd\" (UID: \"b5a16ea8-96e4-4646-ae0e-38e7f000dd39\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5r7ncd"
Apr 17 14:41:44.770882 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:44.770714 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b5a16ea8-96e4-4646-ae0e-38e7f000dd39-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5r7ncd\" (UID: \"b5a16ea8-96e4-4646-ae0e-38e7f000dd39\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5r7ncd"
Apr 17 14:41:44.771150 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:44.771129 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b5a16ea8-96e4-4646-ae0e-38e7f000dd39-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5r7ncd\" (UID: \"b5a16ea8-96e4-4646-ae0e-38e7f000dd39\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5r7ncd"
Apr 17 14:41:44.771185 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:44.771157 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b5a16ea8-96e4-4646-ae0e-38e7f000dd39-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5r7ncd\" (UID: \"b5a16ea8-96e4-4646-ae0e-38e7f000dd39\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5r7ncd"
Apr 17 14:41:44.778870 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:44.778842 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vhhq\" (UniqueName: \"kubernetes.io/projected/b5a16ea8-96e4-4646-ae0e-38e7f000dd39-kube-api-access-2vhhq\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5r7ncd\" (UID: \"b5a16ea8-96e4-4646-ae0e-38e7f000dd39\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5r7ncd"
Apr 17 14:41:44.932216 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:44.932129 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5r7ncd"
Apr 17 14:41:45.057371 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:45.057346 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5r7ncd"]
Apr 17 14:41:45.059859 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:41:45.059825 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb5a16ea8_96e4_4646_ae0e_38e7f000dd39.slice/crio-dddabfc354dbd20840fde59b1e4a5c1cfcd6b9cb45519226b88dc3a6b6ea20f0 WatchSource:0}: Error finding container dddabfc354dbd20840fde59b1e4a5c1cfcd6b9cb45519226b88dc3a6b6ea20f0: Status 404 returned error can't find the container with id dddabfc354dbd20840fde59b1e4a5c1cfcd6b9cb45519226b88dc3a6b6ea20f0
Apr 17 14:41:45.209164 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:45.209072 2582 generic.go:358] "Generic (PLEG): container finished" podID="b5a16ea8-96e4-4646-ae0e-38e7f000dd39" containerID="455fb43de86adcaaa77ef31db90f270e119f868f7e5a7e36ba85f2a317dd039a" exitCode=0
Apr 17 14:41:45.209328 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:45.209158 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5r7ncd" event={"ID":"b5a16ea8-96e4-4646-ae0e-38e7f000dd39","Type":"ContainerDied","Data":"455fb43de86adcaaa77ef31db90f270e119f868f7e5a7e36ba85f2a317dd039a"}
Apr 17 14:41:45.209328 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:45.209199 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5r7ncd" event={"ID":"b5a16ea8-96e4-4646-ae0e-38e7f000dd39","Type":"ContainerStarted","Data":"dddabfc354dbd20840fde59b1e4a5c1cfcd6b9cb45519226b88dc3a6b6ea20f0"}
Apr 17 14:41:46.214159 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:46.214060 2582 generic.go:358] "Generic (PLEG): container finished" podID="b5a16ea8-96e4-4646-ae0e-38e7f000dd39" containerID="8deb69f5ac647a07a01f0c84fec673667a4b308fde9745eb9bbd2b0b681771b8" exitCode=0
Apr 17 14:41:46.214159 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:46.214141 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5r7ncd" event={"ID":"b5a16ea8-96e4-4646-ae0e-38e7f000dd39","Type":"ContainerDied","Data":"8deb69f5ac647a07a01f0c84fec673667a4b308fde9745eb9bbd2b0b681771b8"}
Apr 17 14:41:46.379959 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:41:46.379927 2582 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb5a16ea8_96e4_4646_ae0e_38e7f000dd39.slice/crio-conmon-3413af44102d7db726eaa5c88aee37cc5dc0626ba2ed6853f86017326dac2a29.scope\": RecentStats: unable to find data in memory cache]"
Apr 17 14:41:46.380093 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:41:46.379994 2582 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb5a16ea8_96e4_4646_ae0e_38e7f000dd39.slice/crio-conmon-3413af44102d7db726eaa5c88aee37cc5dc0626ba2ed6853f86017326dac2a29.scope\": RecentStats: unable to find data in memory cache]"
Apr 17 14:41:47.218984 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:47.218946 2582 generic.go:358] "Generic (PLEG): container finished" podID="b5a16ea8-96e4-4646-ae0e-38e7f000dd39" containerID="3413af44102d7db726eaa5c88aee37cc5dc0626ba2ed6853f86017326dac2a29" exitCode=0
Apr 17 14:41:47.219372 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:47.219029 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5r7ncd" event={"ID":"b5a16ea8-96e4-4646-ae0e-38e7f000dd39","Type":"ContainerDied","Data":"3413af44102d7db726eaa5c88aee37cc5dc0626ba2ed6853f86017326dac2a29"}
Apr 17 14:41:48.343507 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:48.343483 2582 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5r7ncd"
Apr 17 14:41:48.400343 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:48.400303 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2vhhq\" (UniqueName: \"kubernetes.io/projected/b5a16ea8-96e4-4646-ae0e-38e7f000dd39-kube-api-access-2vhhq\") pod \"b5a16ea8-96e4-4646-ae0e-38e7f000dd39\" (UID: \"b5a16ea8-96e4-4646-ae0e-38e7f000dd39\") "
Apr 17 14:41:48.400514 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:48.400394 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b5a16ea8-96e4-4646-ae0e-38e7f000dd39-bundle\") pod \"b5a16ea8-96e4-4646-ae0e-38e7f000dd39\" (UID: \"b5a16ea8-96e4-4646-ae0e-38e7f000dd39\") "
Apr 17 14:41:48.400588 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:48.400560 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b5a16ea8-96e4-4646-ae0e-38e7f000dd39-util\") pod \"b5a16ea8-96e4-4646-ae0e-38e7f000dd39\" (UID: \"b5a16ea8-96e4-4646-ae0e-38e7f000dd39\") "
Apr 17 14:41:48.401150 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:48.401122 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5a16ea8-96e4-4646-ae0e-38e7f000dd39-bundle" (OuterVolumeSpecName: "bundle") pod "b5a16ea8-96e4-4646-ae0e-38e7f000dd39" (UID: "b5a16ea8-96e4-4646-ae0e-38e7f000dd39"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 14:41:48.402648 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:48.402629 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5a16ea8-96e4-4646-ae0e-38e7f000dd39-kube-api-access-2vhhq" (OuterVolumeSpecName: "kube-api-access-2vhhq") pod "b5a16ea8-96e4-4646-ae0e-38e7f000dd39" (UID: "b5a16ea8-96e4-4646-ae0e-38e7f000dd39"). InnerVolumeSpecName "kube-api-access-2vhhq". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 14:41:48.405645 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:48.405603 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5a16ea8-96e4-4646-ae0e-38e7f000dd39-util" (OuterVolumeSpecName: "util") pod "b5a16ea8-96e4-4646-ae0e-38e7f000dd39" (UID: "b5a16ea8-96e4-4646-ae0e-38e7f000dd39"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 14:41:48.501201 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:48.501162 2582 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b5a16ea8-96e4-4646-ae0e-38e7f000dd39-bundle\") on node \"ip-10-0-143-171.ec2.internal\" DevicePath \"\""
Apr 17 14:41:48.501201 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:48.501199 2582 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b5a16ea8-96e4-4646-ae0e-38e7f000dd39-util\") on node \"ip-10-0-143-171.ec2.internal\" DevicePath \"\""
Apr 17 14:41:48.501396 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:48.501213 2582 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2vhhq\" (UniqueName: \"kubernetes.io/projected/b5a16ea8-96e4-4646-ae0e-38e7f000dd39-kube-api-access-2vhhq\") on node \"ip-10-0-143-171.ec2.internal\" DevicePath \"\""
Apr 17 14:41:49.227390 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:49.227351 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5r7ncd" event={"ID":"b5a16ea8-96e4-4646-ae0e-38e7f000dd39","Type":"ContainerDied","Data":"dddabfc354dbd20840fde59b1e4a5c1cfcd6b9cb45519226b88dc3a6b6ea20f0"}
Apr 17 14:41:49.227390 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:49.227390 2582 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dddabfc354dbd20840fde59b1e4a5c1cfcd6b9cb45519226b88dc3a6b6ea20f0"
Apr 17 14:41:49.227596 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:49.227398 2582 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5r7ncd"
Apr 17 14:41:54.415713 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:54.415670 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c94qc4l"]
Apr 17 14:41:54.416114 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:54.416018 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b5a16ea8-96e4-4646-ae0e-38e7f000dd39" containerName="extract"
Apr 17 14:41:54.416114 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:54.416030 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5a16ea8-96e4-4646-ae0e-38e7f000dd39" containerName="extract"
Apr 17 14:41:54.416114 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:54.416047 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b5a16ea8-96e4-4646-ae0e-38e7f000dd39" containerName="pull"
Apr 17 14:41:54.416114 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:54.416052 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5a16ea8-96e4-4646-ae0e-38e7f000dd39" containerName="pull"
Apr 17 14:41:54.416114 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:54.416060 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b5a16ea8-96e4-4646-ae0e-38e7f000dd39" containerName="util"
Apr 17 14:41:54.416114 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:54.416064 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5a16ea8-96e4-4646-ae0e-38e7f000dd39" containerName="util"
Apr 17 14:41:54.416288 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:54.416119 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="b5a16ea8-96e4-4646-ae0e-38e7f000dd39" containerName="extract"
Apr 17 14:41:54.420407 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:54.420390 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c94qc4l"
Apr 17 14:41:54.423024 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:54.422997 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 17 14:41:54.423152 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:54.423106 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 17 14:41:54.423955 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:54.423937 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-6dr9g\""
Apr 17 14:41:54.427679 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:54.427654 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c94qc4l"]
Apr 17 14:41:54.448046 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:54.448017 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fd5c4dad-6f11-467b-83ac-c913193222c2-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c94qc4l\" (UID: \"fd5c4dad-6f11-467b-83ac-c913193222c2\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c94qc4l"
Apr 17 14:41:54.448198 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:54.448072 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fd5c4dad-6f11-467b-83ac-c913193222c2-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c94qc4l\" (UID: \"fd5c4dad-6f11-467b-83ac-c913193222c2\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c94qc4l"
Apr 17 14:41:54.448198 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:54.448163 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpfsj\" (UniqueName: \"kubernetes.io/projected/fd5c4dad-6f11-467b-83ac-c913193222c2-kube-api-access-zpfsj\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c94qc4l\" (UID: \"fd5c4dad-6f11-467b-83ac-c913193222c2\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c94qc4l"
Apr 17 14:41:54.549202 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:54.549144 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fd5c4dad-6f11-467b-83ac-c913193222c2-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c94qc4l\" (UID: \"fd5c4dad-6f11-467b-83ac-c913193222c2\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c94qc4l"
Apr 17 14:41:54.549391 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:54.549239 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zpfsj\" (UniqueName: \"kubernetes.io/projected/fd5c4dad-6f11-467b-83ac-c913193222c2-kube-api-access-zpfsj\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c94qc4l\" (UID: \"fd5c4dad-6f11-467b-83ac-c913193222c2\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c94qc4l"
Apr 17 14:41:54.549391 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:54.549292 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fd5c4dad-6f11-467b-83ac-c913193222c2-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c94qc4l\" (UID: \"fd5c4dad-6f11-467b-83ac-c913193222c2\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c94qc4l"
Apr 17 14:41:54.549629 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:54.549607 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fd5c4dad-6f11-467b-83ac-c913193222c2-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c94qc4l\" (UID: \"fd5c4dad-6f11-467b-83ac-c913193222c2\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c94qc4l"
Apr 17 14:41:54.549666 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:54.549617 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fd5c4dad-6f11-467b-83ac-c913193222c2-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c94qc4l\" (UID: \"fd5c4dad-6f11-467b-83ac-c913193222c2\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c94qc4l"
Apr 17 14:41:54.557282 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:54.557254 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpfsj\" (UniqueName: \"kubernetes.io/projected/fd5c4dad-6f11-467b-83ac-c913193222c2-kube-api-access-zpfsj\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c94qc4l\" (UID: \"fd5c4dad-6f11-467b-83ac-c913193222c2\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c94qc4l"
Apr 17 14:41:54.731032 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:54.730933 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c94qc4l"
Apr 17 14:41:54.868950 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:54.868893 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c94qc4l"]
Apr 17 14:41:54.871906 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:41:54.871869 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd5c4dad_6f11_467b_83ac_c913193222c2.slice/crio-e05bf79df27bf0864e622da47288f9908e1204ee2f4b7a411528e3724b7457df WatchSource:0}: Error finding container e05bf79df27bf0864e622da47288f9908e1204ee2f4b7a411528e3724b7457df: Status 404 returned error can't find the container with id e05bf79df27bf0864e622da47288f9908e1204ee2f4b7a411528e3724b7457df
Apr 17 14:41:55.248933 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:55.248903 2582 generic.go:358] "Generic (PLEG): container finished" podID="fd5c4dad-6f11-467b-83ac-c913193222c2" containerID="ca92be746bc01f4d6b8ecdfcb30ff493560bef88d9b7ded6a89fd8885b533444" exitCode=0
Apr 17 14:41:55.249100 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:55.248994 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c94qc4l" event={"ID":"fd5c4dad-6f11-467b-83ac-c913193222c2","Type":"ContainerDied","Data":"ca92be746bc01f4d6b8ecdfcb30ff493560bef88d9b7ded6a89fd8885b533444"}
Apr 17 14:41:55.249100 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:55.249030 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c94qc4l" event={"ID":"fd5c4dad-6f11-467b-83ac-c913193222c2","Type":"ContainerStarted","Data":"e05bf79df27bf0864e622da47288f9908e1204ee2f4b7a411528e3724b7457df"}
Apr 17 14:41:56.253989 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:56.253958 2582 generic.go:358] "Generic (PLEG): container finished" podID="fd5c4dad-6f11-467b-83ac-c913193222c2" containerID="606b1df378ac9825da348cb92c33b49aede566cadea63f47cd5bfc4f8eef3964" exitCode=0
Apr 17 14:41:56.254405 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:56.254040 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c94qc4l" event={"ID":"fd5c4dad-6f11-467b-83ac-c913193222c2","Type":"ContainerDied","Data":"606b1df378ac9825da348cb92c33b49aede566cadea63f47cd5bfc4f8eef3964"}
Apr 17 14:41:56.296604 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:56.296572 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6c585549bc-vpjwk"]
Apr 17 14:41:56.301105 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:56.301082 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-6c585549bc-vpjwk"
Apr 17 14:41:56.304318 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:56.304286 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\""
Apr 17 14:41:56.304318 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:56.304307 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\""
Apr 17 14:41:56.304631 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:56.304616 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-bfw6w\""
Apr 17 14:41:56.304695 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:56.304639 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\""
Apr 17 14:41:56.304946 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:56.304917 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\""
Apr 17 14:41:56.317598 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:56.317567 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6c585549bc-vpjwk"]
Apr 17 14:41:56.365528 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:56.365475 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f87942f0-7ec2-4f54-a6a5-1b433d74a993-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6c585549bc-vpjwk\" (UID: \"f87942f0-7ec2-4f54-a6a5-1b433d74a993\") " pod="opendatahub/opendatahub-operator-controller-manager-6c585549bc-vpjwk"
Apr 17 14:41:56.365692 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:56.365614 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f87942f0-7ec2-4f54-a6a5-1b433d74a993-webhook-cert\") pod \"opendatahub-operator-controller-manager-6c585549bc-vpjwk\" (UID: \"f87942f0-7ec2-4f54-a6a5-1b433d74a993\") " pod="opendatahub/opendatahub-operator-controller-manager-6c585549bc-vpjwk"
Apr 17 14:41:56.365692 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:56.365653 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gm8pz\" (UniqueName: \"kubernetes.io/projected/f87942f0-7ec2-4f54-a6a5-1b433d74a993-kube-api-access-gm8pz\") pod \"opendatahub-operator-controller-manager-6c585549bc-vpjwk\" (UID: \"f87942f0-7ec2-4f54-a6a5-1b433d74a993\") " pod="opendatahub/opendatahub-operator-controller-manager-6c585549bc-vpjwk"
Apr 17 14:41:56.466548 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:56.466506 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f87942f0-7ec2-4f54-a6a5-1b433d74a993-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6c585549bc-vpjwk\" (UID: \"f87942f0-7ec2-4f54-a6a5-1b433d74a993\") " pod="opendatahub/opendatahub-operator-controller-manager-6c585549bc-vpjwk"
Apr 17 14:41:56.466743 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:56.466558 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f87942f0-7ec2-4f54-a6a5-1b433d74a993-webhook-cert\") pod \"opendatahub-operator-controller-manager-6c585549bc-vpjwk\" (UID: \"f87942f0-7ec2-4f54-a6a5-1b433d74a993\") " pod="opendatahub/opendatahub-operator-controller-manager-6c585549bc-vpjwk"
Apr 17 14:41:56.466743 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:56.466588 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gm8pz\" (UniqueName: \"kubernetes.io/projected/f87942f0-7ec2-4f54-a6a5-1b433d74a993-kube-api-access-gm8pz\") pod \"opendatahub-operator-controller-manager-6c585549bc-vpjwk\" (UID: \"f87942f0-7ec2-4f54-a6a5-1b433d74a993\") " pod="opendatahub/opendatahub-operator-controller-manager-6c585549bc-vpjwk"
Apr 17 14:41:56.469203 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:56.469175 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f87942f0-7ec2-4f54-a6a5-1b433d74a993-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6c585549bc-vpjwk\" (UID: \"f87942f0-7ec2-4f54-a6a5-1b433d74a993\") " pod="opendatahub/opendatahub-operator-controller-manager-6c585549bc-vpjwk"
Apr 17 14:41:56.469307 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:56.469203 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f87942f0-7ec2-4f54-a6a5-1b433d74a993-webhook-cert\") pod \"opendatahub-operator-controller-manager-6c585549bc-vpjwk\" (UID: \"f87942f0-7ec2-4f54-a6a5-1b433d74a993\") " pod="opendatahub/opendatahub-operator-controller-manager-6c585549bc-vpjwk"
Apr 17 14:41:56.480584 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:56.480549 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gm8pz\" (UniqueName: \"kubernetes.io/projected/f87942f0-7ec2-4f54-a6a5-1b433d74a993-kube-api-access-gm8pz\") pod \"opendatahub-operator-controller-manager-6c585549bc-vpjwk\" (UID: \"f87942f0-7ec2-4f54-a6a5-1b433d74a993\") " pod="opendatahub/opendatahub-operator-controller-manager-6c585549bc-vpjwk"
Apr 17 14:41:56.619054 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:56.618955 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-6c585549bc-vpjwk"
Apr 17 14:41:56.760992 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:56.760968 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6c585549bc-vpjwk"]
Apr 17 14:41:56.763330 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:41:56.763283 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf87942f0_7ec2_4f54_a6a5_1b433d74a993.slice/crio-51d791beae58ac9dea564769f3c9926d2bc466faebc55113b2262aa6d1565040 WatchSource:0}: Error finding container 51d791beae58ac9dea564769f3c9926d2bc466faebc55113b2262aa6d1565040: Status 404 returned error can't find the container with id 51d791beae58ac9dea564769f3c9926d2bc466faebc55113b2262aa6d1565040
Apr 17 14:41:57.260183 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:57.260144 2582 generic.go:358] "Generic (PLEG): container finished" podID="fd5c4dad-6f11-467b-83ac-c913193222c2" containerID="d6028606a59d242b9be2f509a80e87ee6ebf2af0985339833bd94768a85589be" exitCode=0
Apr 17 14:41:57.260633 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:57.260255 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c94qc4l" event={"ID":"fd5c4dad-6f11-467b-83ac-c913193222c2","Type":"ContainerDied","Data":"d6028606a59d242b9be2f509a80e87ee6ebf2af0985339833bd94768a85589be"}
Apr 17 14:41:57.261610 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:57.261580 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-6c585549bc-vpjwk" event={"ID":"f87942f0-7ec2-4f54-a6a5-1b433d74a993","Type":"ContainerStarted","Data":"51d791beae58ac9dea564769f3c9926d2bc466faebc55113b2262aa6d1565040"}
Apr 17 14:41:59.121658 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:59.121634 2582 util.go:48] "No ready
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c94qc4l" Apr 17 14:41:59.210332 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:59.210307 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zpfsj\" (UniqueName: \"kubernetes.io/projected/fd5c4dad-6f11-467b-83ac-c913193222c2-kube-api-access-zpfsj\") pod \"fd5c4dad-6f11-467b-83ac-c913193222c2\" (UID: \"fd5c4dad-6f11-467b-83ac-c913193222c2\") " Apr 17 14:41:59.210476 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:59.210344 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fd5c4dad-6f11-467b-83ac-c913193222c2-util\") pod \"fd5c4dad-6f11-467b-83ac-c913193222c2\" (UID: \"fd5c4dad-6f11-467b-83ac-c913193222c2\") " Apr 17 14:41:59.210476 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:59.210381 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fd5c4dad-6f11-467b-83ac-c913193222c2-bundle\") pod \"fd5c4dad-6f11-467b-83ac-c913193222c2\" (UID: \"fd5c4dad-6f11-467b-83ac-c913193222c2\") " Apr 17 14:41:59.211308 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:59.211284 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd5c4dad-6f11-467b-83ac-c913193222c2-bundle" (OuterVolumeSpecName: "bundle") pod "fd5c4dad-6f11-467b-83ac-c913193222c2" (UID: "fd5c4dad-6f11-467b-83ac-c913193222c2"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 14:41:59.212388 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:59.212364 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd5c4dad-6f11-467b-83ac-c913193222c2-kube-api-access-zpfsj" (OuterVolumeSpecName: "kube-api-access-zpfsj") pod "fd5c4dad-6f11-467b-83ac-c913193222c2" (UID: "fd5c4dad-6f11-467b-83ac-c913193222c2"). InnerVolumeSpecName "kube-api-access-zpfsj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 14:41:59.215888 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:59.215865 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd5c4dad-6f11-467b-83ac-c913193222c2-util" (OuterVolumeSpecName: "util") pod "fd5c4dad-6f11-467b-83ac-c913193222c2" (UID: "fd5c4dad-6f11-467b-83ac-c913193222c2"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 14:41:59.271620 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:59.271579 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-6c585549bc-vpjwk" event={"ID":"f87942f0-7ec2-4f54-a6a5-1b433d74a993","Type":"ContainerStarted","Data":"0ccb3b29ea98d951997ac684554781f17017553595842300e1abe4be2dba2b66"} Apr 17 14:41:59.271881 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:59.271723 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-6c585549bc-vpjwk" Apr 17 14:41:59.273593 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:59.273563 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c94qc4l" event={"ID":"fd5c4dad-6f11-467b-83ac-c913193222c2","Type":"ContainerDied","Data":"e05bf79df27bf0864e622da47288f9908e1204ee2f4b7a411528e3724b7457df"} Apr 17 14:41:59.273741 ip-10-0-143-171 
kubenswrapper[2582]: I0417 14:41:59.273598 2582 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e05bf79df27bf0864e622da47288f9908e1204ee2f4b7a411528e3724b7457df" Apr 17 14:41:59.280423 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:59.273919 2582 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c94qc4l" Apr 17 14:41:59.297625 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:59.297576 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-6c585549bc-vpjwk" podStartSLOduration=0.90738606 podStartE2EDuration="3.297563026s" podCreationTimestamp="2026-04-17 14:41:56 +0000 UTC" firstStartedPulling="2026-04-17 14:41:56.765223268 +0000 UTC m=+469.632608910" lastFinishedPulling="2026-04-17 14:41:59.155400235 +0000 UTC m=+472.022785876" observedRunningTime="2026-04-17 14:41:59.296092356 +0000 UTC m=+472.163478007" watchObservedRunningTime="2026-04-17 14:41:59.297563026 +0000 UTC m=+472.164948675" Apr 17 14:41:59.311418 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:59.311387 2582 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zpfsj\" (UniqueName: \"kubernetes.io/projected/fd5c4dad-6f11-467b-83ac-c913193222c2-kube-api-access-zpfsj\") on node \"ip-10-0-143-171.ec2.internal\" DevicePath \"\"" Apr 17 14:41:59.311418 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:59.311413 2582 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fd5c4dad-6f11-467b-83ac-c913193222c2-util\") on node \"ip-10-0-143-171.ec2.internal\" DevicePath \"\"" Apr 17 14:41:59.311418 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:41:59.311422 2582 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fd5c4dad-6f11-467b-83ac-c913193222c2-bundle\") on node 
\"ip-10-0-143-171.ec2.internal\" DevicePath \"\"" Apr 17 14:42:10.280130 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:10.280092 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-6c585549bc-vpjwk" Apr 17 14:42:13.200106 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:13.200071 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48359n2sg"] Apr 17 14:42:13.200984 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:13.200955 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fd5c4dad-6f11-467b-83ac-c913193222c2" containerName="pull" Apr 17 14:42:13.200984 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:13.200986 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd5c4dad-6f11-467b-83ac-c913193222c2" containerName="pull" Apr 17 14:42:13.201126 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:13.201009 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fd5c4dad-6f11-467b-83ac-c913193222c2" containerName="util" Apr 17 14:42:13.201126 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:13.201018 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd5c4dad-6f11-467b-83ac-c913193222c2" containerName="util" Apr 17 14:42:13.201126 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:13.201053 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fd5c4dad-6f11-467b-83ac-c913193222c2" containerName="extract" Apr 17 14:42:13.201126 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:13.201060 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd5c4dad-6f11-467b-83ac-c913193222c2" containerName="extract" Apr 17 14:42:13.201237 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:13.201139 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="fd5c4dad-6f11-467b-83ac-c913193222c2" 
containerName="extract" Apr 17 14:42:13.209562 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:13.209543 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48359n2sg" Apr 17 14:42:13.212255 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:13.212231 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-6dr9g\"" Apr 17 14:42:13.212407 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:13.212314 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 17 14:42:13.212724 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:13.212701 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 17 14:42:13.213744 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:13.213721 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48359n2sg"] Apr 17 14:42:13.330150 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:13.330113 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ac8eabf0-bd29-479a-8a20-5ee3e8ee6338-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48359n2sg\" (UID: \"ac8eabf0-bd29-479a-8a20-5ee3e8ee6338\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48359n2sg" Apr 17 14:42:13.330346 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:13.330167 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8pdq\" (UniqueName: \"kubernetes.io/projected/ac8eabf0-bd29-479a-8a20-5ee3e8ee6338-kube-api-access-w8pdq\") pod 
\"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48359n2sg\" (UID: \"ac8eabf0-bd29-479a-8a20-5ee3e8ee6338\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48359n2sg" Apr 17 14:42:13.330346 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:13.330264 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ac8eabf0-bd29-479a-8a20-5ee3e8ee6338-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48359n2sg\" (UID: \"ac8eabf0-bd29-479a-8a20-5ee3e8ee6338\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48359n2sg" Apr 17 14:42:13.431157 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:13.431122 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ac8eabf0-bd29-479a-8a20-5ee3e8ee6338-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48359n2sg\" (UID: \"ac8eabf0-bd29-479a-8a20-5ee3e8ee6338\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48359n2sg" Apr 17 14:42:13.431309 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:13.431181 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ac8eabf0-bd29-479a-8a20-5ee3e8ee6338-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48359n2sg\" (UID: \"ac8eabf0-bd29-479a-8a20-5ee3e8ee6338\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48359n2sg" Apr 17 14:42:13.431309 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:13.431215 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w8pdq\" (UniqueName: \"kubernetes.io/projected/ac8eabf0-bd29-479a-8a20-5ee3e8ee6338-kube-api-access-w8pdq\") pod 
\"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48359n2sg\" (UID: \"ac8eabf0-bd29-479a-8a20-5ee3e8ee6338\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48359n2sg" Apr 17 14:42:13.431535 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:13.431515 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ac8eabf0-bd29-479a-8a20-5ee3e8ee6338-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48359n2sg\" (UID: \"ac8eabf0-bd29-479a-8a20-5ee3e8ee6338\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48359n2sg" Apr 17 14:42:13.431574 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:13.431546 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ac8eabf0-bd29-479a-8a20-5ee3e8ee6338-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48359n2sg\" (UID: \"ac8eabf0-bd29-479a-8a20-5ee3e8ee6338\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48359n2sg" Apr 17 14:42:13.439535 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:13.439501 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8pdq\" (UniqueName: \"kubernetes.io/projected/ac8eabf0-bd29-479a-8a20-5ee3e8ee6338-kube-api-access-w8pdq\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48359n2sg\" (UID: \"ac8eabf0-bd29-479a-8a20-5ee3e8ee6338\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48359n2sg" Apr 17 14:42:13.519944 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:13.519909 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48359n2sg" Apr 17 14:42:13.644092 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:13.644069 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48359n2sg"] Apr 17 14:42:13.646523 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:42:13.646496 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac8eabf0_bd29_479a_8a20_5ee3e8ee6338.slice/crio-44747c5409ce03b028efd9995c600575988824ad08381c0993ab6665e0e8ae3d WatchSource:0}: Error finding container 44747c5409ce03b028efd9995c600575988824ad08381c0993ab6665e0e8ae3d: Status 404 returned error can't find the container with id 44747c5409ce03b028efd9995c600575988824ad08381c0993ab6665e0e8ae3d Apr 17 14:42:14.329323 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:14.329286 2582 generic.go:358] "Generic (PLEG): container finished" podID="ac8eabf0-bd29-479a-8a20-5ee3e8ee6338" containerID="05a74856287c790e537b18746b846134f5dad54151163f267dda53aa905a7235" exitCode=0 Apr 17 14:42:14.329720 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:14.329367 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48359n2sg" event={"ID":"ac8eabf0-bd29-479a-8a20-5ee3e8ee6338","Type":"ContainerDied","Data":"05a74856287c790e537b18746b846134f5dad54151163f267dda53aa905a7235"} Apr 17 14:42:14.329720 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:14.329399 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48359n2sg" event={"ID":"ac8eabf0-bd29-479a-8a20-5ee3e8ee6338","Type":"ContainerStarted","Data":"44747c5409ce03b028efd9995c600575988824ad08381c0993ab6665e0e8ae3d"} Apr 17 14:42:16.338275 ip-10-0-143-171 kubenswrapper[2582]: 
I0417 14:42:16.338241 2582 generic.go:358] "Generic (PLEG): container finished" podID="ac8eabf0-bd29-479a-8a20-5ee3e8ee6338" containerID="29047c2e70a76ee021cae507596fdd502312fff1354f5d77e59726d1f7909e5d" exitCode=0 Apr 17 14:42:16.338714 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:16.338329 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48359n2sg" event={"ID":"ac8eabf0-bd29-479a-8a20-5ee3e8ee6338","Type":"ContainerDied","Data":"29047c2e70a76ee021cae507596fdd502312fff1354f5d77e59726d1f7909e5d"} Apr 17 14:42:17.343606 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:17.343563 2582 generic.go:358] "Generic (PLEG): container finished" podID="ac8eabf0-bd29-479a-8a20-5ee3e8ee6338" containerID="531893a9bce7ee3e3492ed58fd441b8831b1e9cc414f259a61cd69cae6aa1712" exitCode=0 Apr 17 14:42:17.344065 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:17.343654 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48359n2sg" event={"ID":"ac8eabf0-bd29-479a-8a20-5ee3e8ee6338","Type":"ContainerDied","Data":"531893a9bce7ee3e3492ed58fd441b8831b1e9cc414f259a61cd69cae6aa1712"} Apr 17 14:42:18.472617 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:18.472595 2582 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48359n2sg" Apr 17 14:42:18.580322 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:18.579089 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ac8eabf0-bd29-479a-8a20-5ee3e8ee6338-util\") pod \"ac8eabf0-bd29-479a-8a20-5ee3e8ee6338\" (UID: \"ac8eabf0-bd29-479a-8a20-5ee3e8ee6338\") " Apr 17 14:42:18.580509 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:18.580432 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ac8eabf0-bd29-479a-8a20-5ee3e8ee6338-bundle\") pod \"ac8eabf0-bd29-479a-8a20-5ee3e8ee6338\" (UID: \"ac8eabf0-bd29-479a-8a20-5ee3e8ee6338\") " Apr 17 14:42:18.580509 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:18.580475 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w8pdq\" (UniqueName: \"kubernetes.io/projected/ac8eabf0-bd29-479a-8a20-5ee3e8ee6338-kube-api-access-w8pdq\") pod \"ac8eabf0-bd29-479a-8a20-5ee3e8ee6338\" (UID: \"ac8eabf0-bd29-479a-8a20-5ee3e8ee6338\") " Apr 17 14:42:18.581404 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:18.581362 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac8eabf0-bd29-479a-8a20-5ee3e8ee6338-bundle" (OuterVolumeSpecName: "bundle") pod "ac8eabf0-bd29-479a-8a20-5ee3e8ee6338" (UID: "ac8eabf0-bd29-479a-8a20-5ee3e8ee6338"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 14:42:18.583000 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:18.582974 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac8eabf0-bd29-479a-8a20-5ee3e8ee6338-kube-api-access-w8pdq" (OuterVolumeSpecName: "kube-api-access-w8pdq") pod "ac8eabf0-bd29-479a-8a20-5ee3e8ee6338" (UID: "ac8eabf0-bd29-479a-8a20-5ee3e8ee6338"). InnerVolumeSpecName "kube-api-access-w8pdq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 14:42:18.588069 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:18.588040 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac8eabf0-bd29-479a-8a20-5ee3e8ee6338-util" (OuterVolumeSpecName: "util") pod "ac8eabf0-bd29-479a-8a20-5ee3e8ee6338" (UID: "ac8eabf0-bd29-479a-8a20-5ee3e8ee6338"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 14:42:18.682129 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:18.682028 2582 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ac8eabf0-bd29-479a-8a20-5ee3e8ee6338-util\") on node \"ip-10-0-143-171.ec2.internal\" DevicePath \"\"" Apr 17 14:42:18.682129 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:18.682074 2582 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ac8eabf0-bd29-479a-8a20-5ee3e8ee6338-bundle\") on node \"ip-10-0-143-171.ec2.internal\" DevicePath \"\"" Apr 17 14:42:18.682129 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:18.682085 2582 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-w8pdq\" (UniqueName: \"kubernetes.io/projected/ac8eabf0-bd29-479a-8a20-5ee3e8ee6338-kube-api-access-w8pdq\") on node \"ip-10-0-143-171.ec2.internal\" DevicePath \"\"" Apr 17 14:42:19.119066 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:19.119029 2582 kubelet.go:2537] "SyncLoop 
ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-66f567c4b6-b4tb6"] Apr 17 14:42:19.119380 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:19.119366 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ac8eabf0-bd29-479a-8a20-5ee3e8ee6338" containerName="util" Apr 17 14:42:19.119424 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:19.119387 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac8eabf0-bd29-479a-8a20-5ee3e8ee6338" containerName="util" Apr 17 14:42:19.119424 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:19.119408 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ac8eabf0-bd29-479a-8a20-5ee3e8ee6338" containerName="pull" Apr 17 14:42:19.119424 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:19.119417 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac8eabf0-bd29-479a-8a20-5ee3e8ee6338" containerName="pull" Apr 17 14:42:19.119534 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:19.119437 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ac8eabf0-bd29-479a-8a20-5ee3e8ee6338" containerName="extract" Apr 17 14:42:19.119534 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:19.119443 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac8eabf0-bd29-479a-8a20-5ee3e8ee6338" containerName="extract" Apr 17 14:42:19.119534 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:19.119500 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="ac8eabf0-bd29-479a-8a20-5ee3e8ee6338" containerName="extract" Apr 17 14:42:19.123936 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:19.123914 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-66f567c4b6-b4tb6" Apr 17 14:42:19.126645 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:19.126623 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 17 14:42:19.127827 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:19.127784 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 17 14:42:19.127927 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:19.127788 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-bbn7n\"" Apr 17 14:42:19.127927 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:19.127840 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 17 14:42:19.131666 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:19.131642 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-66f567c4b6-b4tb6"] Apr 17 14:42:19.287176 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:19.287118 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/a9b3f507-f98e-4518-bd22-ab9ce481958d-manager-config\") pod \"lws-controller-manager-66f567c4b6-b4tb6\" (UID: \"a9b3f507-f98e-4518-bd22-ab9ce481958d\") " pod="openshift-lws-operator/lws-controller-manager-66f567c4b6-b4tb6" Apr 17 14:42:19.287365 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:19.287243 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9pjx\" (UniqueName: \"kubernetes.io/projected/a9b3f507-f98e-4518-bd22-ab9ce481958d-kube-api-access-v9pjx\") pod \"lws-controller-manager-66f567c4b6-b4tb6\" (UID: 
\"a9b3f507-f98e-4518-bd22-ab9ce481958d\") " pod="openshift-lws-operator/lws-controller-manager-66f567c4b6-b4tb6" Apr 17 14:42:19.287365 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:19.287316 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a9b3f507-f98e-4518-bd22-ab9ce481958d-cert\") pod \"lws-controller-manager-66f567c4b6-b4tb6\" (UID: \"a9b3f507-f98e-4518-bd22-ab9ce481958d\") " pod="openshift-lws-operator/lws-controller-manager-66f567c4b6-b4tb6" Apr 17 14:42:19.287365 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:19.287338 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/a9b3f507-f98e-4518-bd22-ab9ce481958d-metrics-cert\") pod \"lws-controller-manager-66f567c4b6-b4tb6\" (UID: \"a9b3f507-f98e-4518-bd22-ab9ce481958d\") " pod="openshift-lws-operator/lws-controller-manager-66f567c4b6-b4tb6" Apr 17 14:42:19.351832 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:19.351787 2582 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48359n2sg"
Apr 17 14:42:19.351995 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:19.351785 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48359n2sg" event={"ID":"ac8eabf0-bd29-479a-8a20-5ee3e8ee6338","Type":"ContainerDied","Data":"44747c5409ce03b028efd9995c600575988824ad08381c0993ab6665e0e8ae3d"}
Apr 17 14:42:19.351995 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:19.351909 2582 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="44747c5409ce03b028efd9995c600575988824ad08381c0993ab6665e0e8ae3d"
Apr 17 14:42:19.387817 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:19.387715 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/a9b3f507-f98e-4518-bd22-ab9ce481958d-manager-config\") pod \"lws-controller-manager-66f567c4b6-b4tb6\" (UID: \"a9b3f507-f98e-4518-bd22-ab9ce481958d\") " pod="openshift-lws-operator/lws-controller-manager-66f567c4b6-b4tb6"
Apr 17 14:42:19.387817 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:19.387753 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v9pjx\" (UniqueName: \"kubernetes.io/projected/a9b3f507-f98e-4518-bd22-ab9ce481958d-kube-api-access-v9pjx\") pod \"lws-controller-manager-66f567c4b6-b4tb6\" (UID: \"a9b3f507-f98e-4518-bd22-ab9ce481958d\") " pod="openshift-lws-operator/lws-controller-manager-66f567c4b6-b4tb6"
Apr 17 14:42:19.387817 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:19.387777 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a9b3f507-f98e-4518-bd22-ab9ce481958d-cert\") pod \"lws-controller-manager-66f567c4b6-b4tb6\" (UID: \"a9b3f507-f98e-4518-bd22-ab9ce481958d\") " pod="openshift-lws-operator/lws-controller-manager-66f567c4b6-b4tb6"
Apr 17 14:42:19.388083 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:19.387832 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/a9b3f507-f98e-4518-bd22-ab9ce481958d-metrics-cert\") pod \"lws-controller-manager-66f567c4b6-b4tb6\" (UID: \"a9b3f507-f98e-4518-bd22-ab9ce481958d\") " pod="openshift-lws-operator/lws-controller-manager-66f567c4b6-b4tb6"
Apr 17 14:42:19.388436 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:19.388405 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/a9b3f507-f98e-4518-bd22-ab9ce481958d-manager-config\") pod \"lws-controller-manager-66f567c4b6-b4tb6\" (UID: \"a9b3f507-f98e-4518-bd22-ab9ce481958d\") " pod="openshift-lws-operator/lws-controller-manager-66f567c4b6-b4tb6"
Apr 17 14:42:19.390432 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:19.390408 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a9b3f507-f98e-4518-bd22-ab9ce481958d-cert\") pod \"lws-controller-manager-66f567c4b6-b4tb6\" (UID: \"a9b3f507-f98e-4518-bd22-ab9ce481958d\") " pod="openshift-lws-operator/lws-controller-manager-66f567c4b6-b4tb6"
Apr 17 14:42:19.390641 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:19.390619 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/a9b3f507-f98e-4518-bd22-ab9ce481958d-metrics-cert\") pod \"lws-controller-manager-66f567c4b6-b4tb6\" (UID: \"a9b3f507-f98e-4518-bd22-ab9ce481958d\") " pod="openshift-lws-operator/lws-controller-manager-66f567c4b6-b4tb6"
Apr 17 14:42:19.395230 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:19.395206 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9pjx\" (UniqueName: \"kubernetes.io/projected/a9b3f507-f98e-4518-bd22-ab9ce481958d-kube-api-access-v9pjx\") pod \"lws-controller-manager-66f567c4b6-b4tb6\" (UID: \"a9b3f507-f98e-4518-bd22-ab9ce481958d\") " pod="openshift-lws-operator/lws-controller-manager-66f567c4b6-b4tb6"
Apr 17 14:42:19.434796 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:19.434761 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-66f567c4b6-b4tb6"
Apr 17 14:42:19.559151 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:19.559123 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-66f567c4b6-b4tb6"]
Apr 17 14:42:19.560995 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:42:19.560967 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9b3f507_f98e_4518_bd22_ab9ce481958d.slice/crio-a5ca95ca5ed4ca1166f952a81d29a118801dfcc86677004c10fdd2b92845e900 WatchSource:0}: Error finding container a5ca95ca5ed4ca1166f952a81d29a118801dfcc86677004c10fdd2b92845e900: Status 404 returned error can't find the container with id a5ca95ca5ed4ca1166f952a81d29a118801dfcc86677004c10fdd2b92845e900
Apr 17 14:42:20.356622 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:20.356584 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-66f567c4b6-b4tb6" event={"ID":"a9b3f507-f98e-4518-bd22-ab9ce481958d","Type":"ContainerStarted","Data":"a5ca95ca5ed4ca1166f952a81d29a118801dfcc86677004c10fdd2b92845e900"}
Apr 17 14:42:22.364990 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:22.364938 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-66f567c4b6-b4tb6" event={"ID":"a9b3f507-f98e-4518-bd22-ab9ce481958d","Type":"ContainerStarted","Data":"131f473488415c7d8ad5a52907b4349ffc91002cd230e7590a9965339339246c"}
Apr 17 14:42:22.365388 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:22.365012 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-66f567c4b6-b4tb6"
Apr 17 14:42:22.386064 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:22.386019 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-66f567c4b6-b4tb6" podStartSLOduration=1.5598073129999999 podStartE2EDuration="3.385984522s" podCreationTimestamp="2026-04-17 14:42:19 +0000 UTC" firstStartedPulling="2026-04-17 14:42:19.562841448 +0000 UTC m=+492.430227077" lastFinishedPulling="2026-04-17 14:42:21.389018656 +0000 UTC m=+494.256404286" observedRunningTime="2026-04-17 14:42:22.385165375 +0000 UTC m=+495.252551025" watchObservedRunningTime="2026-04-17 14:42:22.385984522 +0000 UTC m=+495.253370172"
Apr 17 14:42:27.388486 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:27.388442 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2rz577"]
Apr 17 14:42:27.391954 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:27.391935 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2rz577"
Apr 17 14:42:27.394708 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:27.394682 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 17 14:42:27.395489 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:27.395470 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-6dr9g\""
Apr 17 14:42:27.395856 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:27.395839 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 17 14:42:27.404961 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:27.404935 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2rz577"]
Apr 17 14:42:27.557187 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:27.557155 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0a78b2c4-4917-43d1-b27a-114c9507ad7f-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2rz577\" (UID: \"0a78b2c4-4917-43d1-b27a-114c9507ad7f\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2rz577"
Apr 17 14:42:27.557187 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:27.557195 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jl2pk\" (UniqueName: \"kubernetes.io/projected/0a78b2c4-4917-43d1-b27a-114c9507ad7f-kube-api-access-jl2pk\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2rz577\" (UID: \"0a78b2c4-4917-43d1-b27a-114c9507ad7f\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2rz577"
Apr 17 14:42:27.557407 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:27.557218 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0a78b2c4-4917-43d1-b27a-114c9507ad7f-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2rz577\" (UID: \"0a78b2c4-4917-43d1-b27a-114c9507ad7f\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2rz577"
Apr 17 14:42:27.658110 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:27.658012 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0a78b2c4-4917-43d1-b27a-114c9507ad7f-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2rz577\" (UID: \"0a78b2c4-4917-43d1-b27a-114c9507ad7f\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2rz577"
Apr 17 14:42:27.658110 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:27.658054 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jl2pk\" (UniqueName: \"kubernetes.io/projected/0a78b2c4-4917-43d1-b27a-114c9507ad7f-kube-api-access-jl2pk\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2rz577\" (UID: \"0a78b2c4-4917-43d1-b27a-114c9507ad7f\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2rz577"
Apr 17 14:42:27.658110 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:27.658080 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0a78b2c4-4917-43d1-b27a-114c9507ad7f-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2rz577\" (UID: \"0a78b2c4-4917-43d1-b27a-114c9507ad7f\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2rz577"
Apr 17 14:42:27.658475 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:27.658443 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0a78b2c4-4917-43d1-b27a-114c9507ad7f-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2rz577\" (UID: \"0a78b2c4-4917-43d1-b27a-114c9507ad7f\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2rz577"
Apr 17 14:42:27.658475 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:27.658467 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0a78b2c4-4917-43d1-b27a-114c9507ad7f-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2rz577\" (UID: \"0a78b2c4-4917-43d1-b27a-114c9507ad7f\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2rz577"
Apr 17 14:42:27.670633 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:27.670606 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jl2pk\" (UniqueName: \"kubernetes.io/projected/0a78b2c4-4917-43d1-b27a-114c9507ad7f-kube-api-access-jl2pk\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2rz577\" (UID: \"0a78b2c4-4917-43d1-b27a-114c9507ad7f\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2rz577"
Apr 17 14:42:27.705469 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:27.705433 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2rz577"
Apr 17 14:42:27.863089 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:27.863058 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2rz577"]
Apr 17 14:42:27.865373 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:42:27.865337 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a78b2c4_4917_43d1_b27a_114c9507ad7f.slice/crio-983fd013c666b9664da141be7def5bde107fefbe537b92aa443d0d0b091993d7 WatchSource:0}: Error finding container 983fd013c666b9664da141be7def5bde107fefbe537b92aa443d0d0b091993d7: Status 404 returned error can't find the container with id 983fd013c666b9664da141be7def5bde107fefbe537b92aa443d0d0b091993d7
Apr 17 14:42:28.388315 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:28.388277 2582 generic.go:358] "Generic (PLEG): container finished" podID="0a78b2c4-4917-43d1-b27a-114c9507ad7f" containerID="eaca465665b19e82ef02ae0e04c2d83f93b4bf1518e4623b11bdbed4c6f6fe74" exitCode=0
Apr 17 14:42:28.388530 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:28.388329 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2rz577" event={"ID":"0a78b2c4-4917-43d1-b27a-114c9507ad7f","Type":"ContainerDied","Data":"eaca465665b19e82ef02ae0e04c2d83f93b4bf1518e4623b11bdbed4c6f6fe74"}
Apr 17 14:42:28.388530 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:28.388353 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2rz577" event={"ID":"0a78b2c4-4917-43d1-b27a-114c9507ad7f","Type":"ContainerStarted","Data":"983fd013c666b9664da141be7def5bde107fefbe537b92aa443d0d0b091993d7"}
Apr 17 14:42:29.393634 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:29.393550 2582 generic.go:358] "Generic (PLEG): container finished" podID="0a78b2c4-4917-43d1-b27a-114c9507ad7f" containerID="c501a4f2747837f6d84953c700b36ec9d8f6da485f1aea173060a6c12f2ceb04" exitCode=0
Apr 17 14:42:29.393634 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:29.393599 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2rz577" event={"ID":"0a78b2c4-4917-43d1-b27a-114c9507ad7f","Type":"ContainerDied","Data":"c501a4f2747837f6d84953c700b36ec9d8f6da485f1aea173060a6c12f2ceb04"}
Apr 17 14:42:30.398861 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:30.398831 2582 generic.go:358] "Generic (PLEG): container finished" podID="0a78b2c4-4917-43d1-b27a-114c9507ad7f" containerID="e200e9c995ed87f5d77a3d17619323690183dc7146931d70168a5b2791d589d3" exitCode=0
Apr 17 14:42:30.399254 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:30.398915 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2rz577" event={"ID":"0a78b2c4-4917-43d1-b27a-114c9507ad7f","Type":"ContainerDied","Data":"e200e9c995ed87f5d77a3d17619323690183dc7146931d70168a5b2791d589d3"}
Apr 17 14:42:31.525638 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:31.525611 2582 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2rz577"
Apr 17 14:42:31.589847 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:31.589787 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0a78b2c4-4917-43d1-b27a-114c9507ad7f-util\") pod \"0a78b2c4-4917-43d1-b27a-114c9507ad7f\" (UID: \"0a78b2c4-4917-43d1-b27a-114c9507ad7f\") "
Apr 17 14:42:31.589847 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:31.589847 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0a78b2c4-4917-43d1-b27a-114c9507ad7f-bundle\") pod \"0a78b2c4-4917-43d1-b27a-114c9507ad7f\" (UID: \"0a78b2c4-4917-43d1-b27a-114c9507ad7f\") "
Apr 17 14:42:31.590059 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:31.589901 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jl2pk\" (UniqueName: \"kubernetes.io/projected/0a78b2c4-4917-43d1-b27a-114c9507ad7f-kube-api-access-jl2pk\") pod \"0a78b2c4-4917-43d1-b27a-114c9507ad7f\" (UID: \"0a78b2c4-4917-43d1-b27a-114c9507ad7f\") "
Apr 17 14:42:31.590704 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:31.590679 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a78b2c4-4917-43d1-b27a-114c9507ad7f-bundle" (OuterVolumeSpecName: "bundle") pod "0a78b2c4-4917-43d1-b27a-114c9507ad7f" (UID: "0a78b2c4-4917-43d1-b27a-114c9507ad7f"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 14:42:31.592125 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:31.592099 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a78b2c4-4917-43d1-b27a-114c9507ad7f-kube-api-access-jl2pk" (OuterVolumeSpecName: "kube-api-access-jl2pk") pod "0a78b2c4-4917-43d1-b27a-114c9507ad7f" (UID: "0a78b2c4-4917-43d1-b27a-114c9507ad7f"). InnerVolumeSpecName "kube-api-access-jl2pk". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 14:42:31.595484 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:31.595441 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a78b2c4-4917-43d1-b27a-114c9507ad7f-util" (OuterVolumeSpecName: "util") pod "0a78b2c4-4917-43d1-b27a-114c9507ad7f" (UID: "0a78b2c4-4917-43d1-b27a-114c9507ad7f"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 14:42:31.690640 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:31.690550 2582 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0a78b2c4-4917-43d1-b27a-114c9507ad7f-util\") on node \"ip-10-0-143-171.ec2.internal\" DevicePath \"\""
Apr 17 14:42:31.690640 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:31.690586 2582 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0a78b2c4-4917-43d1-b27a-114c9507ad7f-bundle\") on node \"ip-10-0-143-171.ec2.internal\" DevicePath \"\""
Apr 17 14:42:31.690640 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:31.690596 2582 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jl2pk\" (UniqueName: \"kubernetes.io/projected/0a78b2c4-4917-43d1-b27a-114c9507ad7f-kube-api-access-jl2pk\") on node \"ip-10-0-143-171.ec2.internal\" DevicePath \"\""
Apr 17 14:42:32.407220 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:32.407178 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2rz577" event={"ID":"0a78b2c4-4917-43d1-b27a-114c9507ad7f","Type":"ContainerDied","Data":"983fd013c666b9664da141be7def5bde107fefbe537b92aa443d0d0b091993d7"}
Apr 17 14:42:32.407220 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:32.407222 2582 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="983fd013c666b9664da141be7def5bde107fefbe537b92aa443d0d0b091993d7"
Apr 17 14:42:32.407448 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:32.407267 2582 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2rz577"
Apr 17 14:42:33.371249 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:33.371217 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-66f567c4b6-b4tb6"
Apr 17 14:42:48.722071 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:48.721847 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfjsvgh"]
Apr 17 14:42:48.722653 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:48.722347 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0a78b2c4-4917-43d1-b27a-114c9507ad7f" containerName="util"
Apr 17 14:42:48.722653 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:48.722363 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a78b2c4-4917-43d1-b27a-114c9507ad7f" containerName="util"
Apr 17 14:42:48.722653 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:48.722382 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0a78b2c4-4917-43d1-b27a-114c9507ad7f" containerName="pull"
Apr 17 14:42:48.722653 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:48.722389 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a78b2c4-4917-43d1-b27a-114c9507ad7f" containerName="pull"
Apr 17 14:42:48.722653 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:48.722401 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0a78b2c4-4917-43d1-b27a-114c9507ad7f" containerName="extract"
Apr 17 14:42:48.722653 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:48.722408 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a78b2c4-4917-43d1-b27a-114c9507ad7f" containerName="extract"
Apr 17 14:42:48.722653 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:48.722463 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="0a78b2c4-4917-43d1-b27a-114c9507ad7f" containerName="extract"
Apr 17 14:42:48.724927 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:48.724907 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfjsvgh"
Apr 17 14:42:48.727508 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:48.727479 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"data-science-gateway-data-science-gateway-class-dockercfg-stmdp\""
Apr 17 14:42:48.727623 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:48.727479 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\""
Apr 17 14:42:48.727623 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:48.727488 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 17 14:42:48.727623 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:48.727496 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 17 14:42:48.735760 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:48.735730 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfjsvgh"]
Apr 17 14:42:48.830713 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:48.830674 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/22578833-852e-415f-929c-8d9010f87cee-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfjsvgh\" (UID: \"22578833-852e-415f-929c-8d9010f87cee\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfjsvgh"
Apr 17 14:42:48.830895 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:48.830717 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/22578833-852e-415f-929c-8d9010f87cee-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfjsvgh\" (UID: \"22578833-852e-415f-929c-8d9010f87cee\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfjsvgh"
Apr 17 14:42:48.830895 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:48.830748 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/22578833-852e-415f-929c-8d9010f87cee-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfjsvgh\" (UID: \"22578833-852e-415f-929c-8d9010f87cee\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfjsvgh"
Apr 17 14:42:48.830895 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:48.830776 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/22578833-852e-415f-929c-8d9010f87cee-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfjsvgh\" (UID: \"22578833-852e-415f-929c-8d9010f87cee\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfjsvgh"
Apr 17 14:42:48.830895 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:48.830791 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/22578833-852e-415f-929c-8d9010f87cee-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfjsvgh\" (UID: \"22578833-852e-415f-929c-8d9010f87cee\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfjsvgh"
Apr 17 14:42:48.830895 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:48.830868 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/22578833-852e-415f-929c-8d9010f87cee-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfjsvgh\" (UID: \"22578833-852e-415f-929c-8d9010f87cee\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfjsvgh"
Apr 17 14:42:48.831108 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:48.830907 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tzbw\" (UniqueName: \"kubernetes.io/projected/22578833-852e-415f-929c-8d9010f87cee-kube-api-access-9tzbw\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfjsvgh\" (UID: \"22578833-852e-415f-929c-8d9010f87cee\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfjsvgh"
Apr 17 14:42:48.831108 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:48.830933 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/22578833-852e-415f-929c-8d9010f87cee-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfjsvgh\" (UID: \"22578833-852e-415f-929c-8d9010f87cee\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfjsvgh"
Apr 17 14:42:48.831108 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:48.830964 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/22578833-852e-415f-929c-8d9010f87cee-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfjsvgh\" (UID: \"22578833-852e-415f-929c-8d9010f87cee\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfjsvgh"
Apr 17 14:42:48.932023 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:48.931983 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/22578833-852e-415f-929c-8d9010f87cee-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfjsvgh\" (UID: \"22578833-852e-415f-929c-8d9010f87cee\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfjsvgh"
Apr 17 14:42:48.932216 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:48.932046 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9tzbw\" (UniqueName: \"kubernetes.io/projected/22578833-852e-415f-929c-8d9010f87cee-kube-api-access-9tzbw\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfjsvgh\" (UID: \"22578833-852e-415f-929c-8d9010f87cee\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfjsvgh"
Apr 17 14:42:48.932216 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:48.932093 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/22578833-852e-415f-929c-8d9010f87cee-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfjsvgh\" (UID: \"22578833-852e-415f-929c-8d9010f87cee\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfjsvgh"
Apr 17 14:42:48.932216 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:48.932133 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/22578833-852e-415f-929c-8d9010f87cee-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfjsvgh\" (UID: \"22578833-852e-415f-929c-8d9010f87cee\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfjsvgh"
Apr 17 14:42:48.932216 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:48.932205 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/22578833-852e-415f-929c-8d9010f87cee-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfjsvgh\" (UID: \"22578833-852e-415f-929c-8d9010f87cee\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfjsvgh"
Apr 17 14:42:48.932439 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:48.932233 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/22578833-852e-415f-929c-8d9010f87cee-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfjsvgh\" (UID: \"22578833-852e-415f-929c-8d9010f87cee\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfjsvgh"
Apr 17 14:42:48.932439 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:48.932261 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/22578833-852e-415f-929c-8d9010f87cee-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfjsvgh\" (UID: \"22578833-852e-415f-929c-8d9010f87cee\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfjsvgh"
Apr 17 14:42:48.932439 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:48.932298 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/22578833-852e-415f-929c-8d9010f87cee-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfjsvgh\" (UID: \"22578833-852e-415f-929c-8d9010f87cee\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfjsvgh"
Apr 17 14:42:48.932439 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:48.932323 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/22578833-852e-415f-929c-8d9010f87cee-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfjsvgh\" (UID: \"22578833-852e-415f-929c-8d9010f87cee\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfjsvgh"
Apr 17 14:42:48.932860 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:48.932829 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/22578833-852e-415f-929c-8d9010f87cee-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfjsvgh\" (UID: \"22578833-852e-415f-929c-8d9010f87cee\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfjsvgh"
Apr 17 14:42:48.933008 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:48.932893 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/22578833-852e-415f-929c-8d9010f87cee-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfjsvgh\" (UID: \"22578833-852e-415f-929c-8d9010f87cee\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfjsvgh"
Apr 17 14:42:48.933076 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:48.933016 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/22578833-852e-415f-929c-8d9010f87cee-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfjsvgh\" (UID: \"22578833-852e-415f-929c-8d9010f87cee\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfjsvgh"
Apr 17 14:42:48.933136 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:48.933086 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/22578833-852e-415f-929c-8d9010f87cee-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfjsvgh\" (UID: \"22578833-852e-415f-929c-8d9010f87cee\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfjsvgh"
Apr 17 14:42:48.933255 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:48.933237 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/22578833-852e-415f-929c-8d9010f87cee-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfjsvgh\" (UID: \"22578833-852e-415f-929c-8d9010f87cee\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfjsvgh"
Apr 17 14:42:48.935110 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:48.935087 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/22578833-852e-415f-929c-8d9010f87cee-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfjsvgh\" (UID: \"22578833-852e-415f-929c-8d9010f87cee\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfjsvgh"
Apr 17 14:42:48.935463 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:48.935441 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/22578833-852e-415f-929c-8d9010f87cee-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfjsvgh\" (UID: \"22578833-852e-415f-929c-8d9010f87cee\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfjsvgh"
Apr 17 14:42:48.939958 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:48.939931 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tzbw\" (UniqueName: \"kubernetes.io/projected/22578833-852e-415f-929c-8d9010f87cee-kube-api-access-9tzbw\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfjsvgh\" (UID: \"22578833-852e-415f-929c-8d9010f87cee\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfjsvgh"
Apr 17 14:42:48.940249 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:48.940229 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/22578833-852e-415f-929c-8d9010f87cee-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfjsvgh\" (UID: \"22578833-852e-415f-929c-8d9010f87cee\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfjsvgh"
Apr 17 14:42:49.040263 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:49.040218 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfjsvgh"
Apr 17 14:42:49.182076 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:49.181966 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfjsvgh"]
Apr 17 14:42:49.185028 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:42:49.184994 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22578833_852e_415f_929c_8d9010f87cee.slice/crio-b1c0064e0547d13a5e4b6aa81e5549e4e002b7e4e4f0b78a7a90da86f4d1b7b4 WatchSource:0}: Error finding container b1c0064e0547d13a5e4b6aa81e5549e4e002b7e4e4f0b78a7a90da86f4d1b7b4: Status 404 returned error can't find the container with id b1c0064e0547d13a5e4b6aa81e5549e4e002b7e4e4f0b78a7a90da86f4d1b7b4
Apr 17 14:42:49.474792 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:49.474707 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfjsvgh" event={"ID":"22578833-852e-415f-929c-8d9010f87cee","Type":"ContainerStarted","Data":"b1c0064e0547d13a5e4b6aa81e5549e4e002b7e4e4f0b78a7a90da86f4d1b7b4"}
Apr 17 14:42:51.521671 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:51.521635 2582 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892164Ki","pods":"250"}
Apr 17 14:42:51.521918 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:51.521724 2582 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892164Ki","pods":"250"}
Apr 17 14:42:51.521918 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:51.521756 2582 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892164Ki","pods":"250"}
Apr 17 14:42:52.487307 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:52.487264 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfjsvgh" event={"ID":"22578833-852e-415f-929c-8d9010f87cee","Type":"ContainerStarted","Data":"9076ac7ad24a7a441306cb3784f02f21300a88495e6564b9fac9acbefe3e9d55"}
Apr 17 14:42:52.507824 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:52.507745 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfjsvgh" podStartSLOduration=2.173857012 podStartE2EDuration="4.507726405s" podCreationTimestamp="2026-04-17 14:42:48 +0000 UTC" firstStartedPulling="2026-04-17 14:42:49.187519017 +0000 UTC m=+522.054904650" lastFinishedPulling="2026-04-17 14:42:51.521388412 +0000 UTC m=+524.388774043" observedRunningTime="2026-04-17 14:42:52.506796705 +0000 UTC m=+525.374182359" watchObservedRunningTime="2026-04-17 14:42:52.507726405 +0000 UTC m=+525.375112059"
Apr 17 14:42:53.040453 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:53.040412 2582 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfjsvgh"
Apr 17 14:42:53.045317 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:53.045286 2582 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfjsvgh"
Apr 17 14:42:53.491202 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:42:53.491116 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfjsvgh"
Apr 17 14:42:53.492115 ip-10-0-143-171 kubenswrapper[2582]:
I0417 14:42:53.492096 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfjsvgh" Apr 17 14:43:14.200546 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:14.200458 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-mbx5g"] Apr 17 14:43:14.203934 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:14.203909 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-mbx5g" Apr 17 14:43:14.206546 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:14.206519 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 17 14:43:14.207491 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:14.207467 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-catalog-dockercfg-mlwd2\"" Apr 17 14:43:14.207595 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:14.207464 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 17 14:43:14.213017 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:14.212991 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-mbx5g"] Apr 17 14:43:14.361452 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:14.361414 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srzvz\" (UniqueName: \"kubernetes.io/projected/6848af81-8a88-4643-b432-1a7c1cdd6887-kube-api-access-srzvz\") pod \"kuadrant-operator-catalog-mbx5g\" (UID: \"6848af81-8a88-4643-b432-1a7c1cdd6887\") " pod="kuadrant-system/kuadrant-operator-catalog-mbx5g" Apr 17 14:43:14.462678 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:14.462585 2582 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-srzvz\" (UniqueName: \"kubernetes.io/projected/6848af81-8a88-4643-b432-1a7c1cdd6887-kube-api-access-srzvz\") pod \"kuadrant-operator-catalog-mbx5g\" (UID: \"6848af81-8a88-4643-b432-1a7c1cdd6887\") " pod="kuadrant-system/kuadrant-operator-catalog-mbx5g" Apr 17 14:43:14.470535 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:14.470508 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-srzvz\" (UniqueName: \"kubernetes.io/projected/6848af81-8a88-4643-b432-1a7c1cdd6887-kube-api-access-srzvz\") pod \"kuadrant-operator-catalog-mbx5g\" (UID: \"6848af81-8a88-4643-b432-1a7c1cdd6887\") " pod="kuadrant-system/kuadrant-operator-catalog-mbx5g" Apr 17 14:43:14.515364 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:14.515318 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-mbx5g" Apr 17 14:43:14.575467 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:14.575436 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-mbx5g"] Apr 17 14:43:14.646405 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:14.646378 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-mbx5g"] Apr 17 14:43:14.648345 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:43:14.648309 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6848af81_8a88_4643_b432_1a7c1cdd6887.slice/crio-945485ccb086978c35573f182129075617c060ddb2a6f307103f7e4f64506fe8 WatchSource:0}: Error finding container 945485ccb086978c35573f182129075617c060ddb2a6f307103f7e4f64506fe8: Status 404 returned error can't find the container with id 945485ccb086978c35573f182129075617c060ddb2a6f307103f7e4f64506fe8 Apr 17 14:43:14.782343 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:14.782306 2582 
kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-lgvvl"] Apr 17 14:43:14.785109 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:14.785091 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-lgvvl" Apr 17 14:43:14.792281 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:14.792247 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-lgvvl"] Apr 17 14:43:14.965886 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:14.965836 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6llg5\" (UniqueName: \"kubernetes.io/projected/85d9d156-7cd5-4964-bafa-90b8956c99e2-kube-api-access-6llg5\") pod \"kuadrant-operator-catalog-lgvvl\" (UID: \"85d9d156-7cd5-4964-bafa-90b8956c99e2\") " pod="kuadrant-system/kuadrant-operator-catalog-lgvvl" Apr 17 14:43:15.066643 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:15.066533 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6llg5\" (UniqueName: \"kubernetes.io/projected/85d9d156-7cd5-4964-bafa-90b8956c99e2-kube-api-access-6llg5\") pod \"kuadrant-operator-catalog-lgvvl\" (UID: \"85d9d156-7cd5-4964-bafa-90b8956c99e2\") " pod="kuadrant-system/kuadrant-operator-catalog-lgvvl" Apr 17 14:43:15.074523 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:15.074495 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6llg5\" (UniqueName: \"kubernetes.io/projected/85d9d156-7cd5-4964-bafa-90b8956c99e2-kube-api-access-6llg5\") pod \"kuadrant-operator-catalog-lgvvl\" (UID: \"85d9d156-7cd5-4964-bafa-90b8956c99e2\") " pod="kuadrant-system/kuadrant-operator-catalog-lgvvl" Apr 17 14:43:15.098492 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:15.098452 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-lgvvl" Apr 17 14:43:15.240292 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:15.240212 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-lgvvl"] Apr 17 14:43:15.244179 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:43:15.244138 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod85d9d156_7cd5_4964_bafa_90b8956c99e2.slice/crio-bb84d48b7d1447c35a05ba11e14bbc17c859c4ff977394fe80f3a1b3e054164d WatchSource:0}: Error finding container bb84d48b7d1447c35a05ba11e14bbc17c859c4ff977394fe80f3a1b3e054164d: Status 404 returned error can't find the container with id bb84d48b7d1447c35a05ba11e14bbc17c859c4ff977394fe80f3a1b3e054164d Apr 17 14:43:15.572952 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:15.572905 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-lgvvl" event={"ID":"85d9d156-7cd5-4964-bafa-90b8956c99e2","Type":"ContainerStarted","Data":"bb84d48b7d1447c35a05ba11e14bbc17c859c4ff977394fe80f3a1b3e054164d"} Apr 17 14:43:15.574186 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:15.574143 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-mbx5g" event={"ID":"6848af81-8a88-4643-b432-1a7c1cdd6887","Type":"ContainerStarted","Data":"945485ccb086978c35573f182129075617c060ddb2a6f307103f7e4f64506fe8"} Apr 17 14:43:17.583917 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:17.583867 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-mbx5g" event={"ID":"6848af81-8a88-4643-b432-1a7c1cdd6887","Type":"ContainerStarted","Data":"69ca662f934238a9e99118c947e8a8c53a523ea4b96a40f9a5c3c706876afb83"} Apr 17 14:43:17.584390 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:17.583941 2582 kuberuntime_container.go:864] "Killing container with 
a grace period" pod="kuadrant-system/kuadrant-operator-catalog-mbx5g" podUID="6848af81-8a88-4643-b432-1a7c1cdd6887" containerName="registry-server" containerID="cri-o://69ca662f934238a9e99118c947e8a8c53a523ea4b96a40f9a5c3c706876afb83" gracePeriod=2 Apr 17 14:43:17.585433 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:17.585401 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-lgvvl" event={"ID":"85d9d156-7cd5-4964-bafa-90b8956c99e2","Type":"ContainerStarted","Data":"7c566f6140cad16be83be090f2893ec02322811238c26ba97efa68eef8d66e17"} Apr 17 14:43:17.599751 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:17.599656 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-mbx5g" podStartSLOduration=1.374638842 podStartE2EDuration="3.599641658s" podCreationTimestamp="2026-04-17 14:43:14 +0000 UTC" firstStartedPulling="2026-04-17 14:43:14.649691616 +0000 UTC m=+547.517077244" lastFinishedPulling="2026-04-17 14:43:16.87469443 +0000 UTC m=+549.742080060" observedRunningTime="2026-04-17 14:43:17.59704371 +0000 UTC m=+550.464429361" watchObservedRunningTime="2026-04-17 14:43:17.599641658 +0000 UTC m=+550.467027345" Apr 17 14:43:17.613174 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:17.613121 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-lgvvl" podStartSLOduration=1.981441824 podStartE2EDuration="3.613106143s" podCreationTimestamp="2026-04-17 14:43:14 +0000 UTC" firstStartedPulling="2026-04-17 14:43:15.245649124 +0000 UTC m=+548.113034753" lastFinishedPulling="2026-04-17 14:43:16.87731344 +0000 UTC m=+549.744699072" observedRunningTime="2026-04-17 14:43:17.610369105 +0000 UTC m=+550.477754756" watchObservedRunningTime="2026-04-17 14:43:17.613106143 +0000 UTC m=+550.480491813" Apr 17 14:43:17.835781 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:17.835707 2582 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-mbx5g" Apr 17 14:43:17.991282 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:17.991243 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-srzvz\" (UniqueName: \"kubernetes.io/projected/6848af81-8a88-4643-b432-1a7c1cdd6887-kube-api-access-srzvz\") pod \"6848af81-8a88-4643-b432-1a7c1cdd6887\" (UID: \"6848af81-8a88-4643-b432-1a7c1cdd6887\") " Apr 17 14:43:17.993547 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:17.993523 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6848af81-8a88-4643-b432-1a7c1cdd6887-kube-api-access-srzvz" (OuterVolumeSpecName: "kube-api-access-srzvz") pod "6848af81-8a88-4643-b432-1a7c1cdd6887" (UID: "6848af81-8a88-4643-b432-1a7c1cdd6887"). InnerVolumeSpecName "kube-api-access-srzvz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 14:43:18.092395 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:18.092314 2582 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-srzvz\" (UniqueName: \"kubernetes.io/projected/6848af81-8a88-4643-b432-1a7c1cdd6887-kube-api-access-srzvz\") on node \"ip-10-0-143-171.ec2.internal\" DevicePath \"\"" Apr 17 14:43:18.590236 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:18.590198 2582 generic.go:358] "Generic (PLEG): container finished" podID="6848af81-8a88-4643-b432-1a7c1cdd6887" containerID="69ca662f934238a9e99118c947e8a8c53a523ea4b96a40f9a5c3c706876afb83" exitCode=0 Apr 17 14:43:18.590630 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:18.590261 2582 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-mbx5g" Apr 17 14:43:18.590630 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:18.590286 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-mbx5g" event={"ID":"6848af81-8a88-4643-b432-1a7c1cdd6887","Type":"ContainerDied","Data":"69ca662f934238a9e99118c947e8a8c53a523ea4b96a40f9a5c3c706876afb83"} Apr 17 14:43:18.590630 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:18.590330 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-mbx5g" event={"ID":"6848af81-8a88-4643-b432-1a7c1cdd6887","Type":"ContainerDied","Data":"945485ccb086978c35573f182129075617c060ddb2a6f307103f7e4f64506fe8"} Apr 17 14:43:18.590630 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:18.590352 2582 scope.go:117] "RemoveContainer" containerID="69ca662f934238a9e99118c947e8a8c53a523ea4b96a40f9a5c3c706876afb83" Apr 17 14:43:18.599533 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:18.599510 2582 scope.go:117] "RemoveContainer" containerID="69ca662f934238a9e99118c947e8a8c53a523ea4b96a40f9a5c3c706876afb83" Apr 17 14:43:18.599943 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:43:18.599922 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69ca662f934238a9e99118c947e8a8c53a523ea4b96a40f9a5c3c706876afb83\": container with ID starting with 69ca662f934238a9e99118c947e8a8c53a523ea4b96a40f9a5c3c706876afb83 not found: ID does not exist" containerID="69ca662f934238a9e99118c947e8a8c53a523ea4b96a40f9a5c3c706876afb83" Apr 17 14:43:18.600001 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:18.599955 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69ca662f934238a9e99118c947e8a8c53a523ea4b96a40f9a5c3c706876afb83"} err="failed to get container status \"69ca662f934238a9e99118c947e8a8c53a523ea4b96a40f9a5c3c706876afb83\": rpc 
error: code = NotFound desc = could not find container \"69ca662f934238a9e99118c947e8a8c53a523ea4b96a40f9a5c3c706876afb83\": container with ID starting with 69ca662f934238a9e99118c947e8a8c53a523ea4b96a40f9a5c3c706876afb83 not found: ID does not exist" Apr 17 14:43:18.611044 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:18.611017 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-mbx5g"] Apr 17 14:43:18.612991 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:18.612962 2582 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-mbx5g"] Apr 17 14:43:19.743866 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:19.743830 2582 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6848af81-8a88-4643-b432-1a7c1cdd6887" path="/var/lib/kubelet/pods/6848af81-8a88-4643-b432-1a7c1cdd6887/volumes" Apr 17 14:43:25.099495 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:25.099460 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-catalog-lgvvl" Apr 17 14:43:25.099943 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:25.099604 2582 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kuadrant-system/kuadrant-operator-catalog-lgvvl" Apr 17 14:43:25.122044 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:25.122015 2582 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kuadrant-system/kuadrant-operator-catalog-lgvvl" Apr 17 14:43:25.637106 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:25.637078 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-catalog-lgvvl" Apr 17 14:43:29.416458 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:29.416420 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759lwhk8"] Apr 17 14:43:29.416888 
ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:29.416764 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6848af81-8a88-4643-b432-1a7c1cdd6887" containerName="registry-server" Apr 17 14:43:29.416888 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:29.416789 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="6848af81-8a88-4643-b432-1a7c1cdd6887" containerName="registry-server" Apr 17 14:43:29.416888 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:29.416872 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="6848af81-8a88-4643-b432-1a7c1cdd6887" containerName="registry-server" Apr 17 14:43:29.424824 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:29.424782 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759lwhk8" Apr 17 14:43:29.426193 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:29.426162 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759lwhk8"] Apr 17 14:43:29.427118 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:29.427099 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-f6hxm\"" Apr 17 14:43:29.481992 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:29.481942 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrltb\" (UniqueName: \"kubernetes.io/projected/897b8f66-fa5f-4f88-8b95-fbe02babf7ba-kube-api-access-qrltb\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759lwhk8\" (UID: \"897b8f66-fa5f-4f88-8b95-fbe02babf7ba\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759lwhk8" Apr 17 14:43:29.482172 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:29.482028 2582 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/897b8f66-fa5f-4f88-8b95-fbe02babf7ba-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759lwhk8\" (UID: \"897b8f66-fa5f-4f88-8b95-fbe02babf7ba\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759lwhk8" Apr 17 14:43:29.482172 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:29.482059 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/897b8f66-fa5f-4f88-8b95-fbe02babf7ba-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759lwhk8\" (UID: \"897b8f66-fa5f-4f88-8b95-fbe02babf7ba\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759lwhk8" Apr 17 14:43:29.582653 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:29.582607 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/897b8f66-fa5f-4f88-8b95-fbe02babf7ba-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759lwhk8\" (UID: \"897b8f66-fa5f-4f88-8b95-fbe02babf7ba\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759lwhk8" Apr 17 14:43:29.582847 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:29.582662 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/897b8f66-fa5f-4f88-8b95-fbe02babf7ba-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759lwhk8\" (UID: \"897b8f66-fa5f-4f88-8b95-fbe02babf7ba\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759lwhk8" Apr 17 14:43:29.582847 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:29.582724 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qrltb\" (UniqueName: 
\"kubernetes.io/projected/897b8f66-fa5f-4f88-8b95-fbe02babf7ba-kube-api-access-qrltb\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759lwhk8\" (UID: \"897b8f66-fa5f-4f88-8b95-fbe02babf7ba\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759lwhk8" Apr 17 14:43:29.583033 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:29.583012 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/897b8f66-fa5f-4f88-8b95-fbe02babf7ba-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759lwhk8\" (UID: \"897b8f66-fa5f-4f88-8b95-fbe02babf7ba\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759lwhk8" Apr 17 14:43:29.583123 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:29.583080 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/897b8f66-fa5f-4f88-8b95-fbe02babf7ba-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759lwhk8\" (UID: \"897b8f66-fa5f-4f88-8b95-fbe02babf7ba\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759lwhk8" Apr 17 14:43:29.591547 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:29.591519 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrltb\" (UniqueName: \"kubernetes.io/projected/897b8f66-fa5f-4f88-8b95-fbe02babf7ba-kube-api-access-qrltb\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759lwhk8\" (UID: \"897b8f66-fa5f-4f88-8b95-fbe02babf7ba\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759lwhk8" Apr 17 14:43:29.736560 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:29.736456 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759lwhk8" Apr 17 14:43:29.817459 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:29.817424 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1j4m4t"] Apr 17 14:43:29.850860 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:29.850682 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1j4m4t" Apr 17 14:43:29.854650 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:29.854593 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1j4m4t"] Apr 17 14:43:29.885917 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:29.885883 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npww5\" (UniqueName: \"kubernetes.io/projected/243de439-5be8-46ad-a8ce-3b20ee27fdb6-kube-api-access-npww5\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1j4m4t\" (UID: \"243de439-5be8-46ad-a8ce-3b20ee27fdb6\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1j4m4t" Apr 17 14:43:29.886088 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:29.885943 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/243de439-5be8-46ad-a8ce-3b20ee27fdb6-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1j4m4t\" (UID: \"243de439-5be8-46ad-a8ce-3b20ee27fdb6\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1j4m4t" Apr 17 14:43:29.886088 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:29.885966 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" 
(UniqueName: \"kubernetes.io/empty-dir/243de439-5be8-46ad-a8ce-3b20ee27fdb6-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1j4m4t\" (UID: \"243de439-5be8-46ad-a8ce-3b20ee27fdb6\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1j4m4t" Apr 17 14:43:29.895049 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:29.895023 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759lwhk8"] Apr 17 14:43:29.897503 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:43:29.897477 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod897b8f66_fa5f_4f88_8b95_fbe02babf7ba.slice/crio-d9e25e351fa9b6de2126dec7d50fd448a380f9de2be5151f832cc5b11ddc5a81 WatchSource:0}: Error finding container d9e25e351fa9b6de2126dec7d50fd448a380f9de2be5151f832cc5b11ddc5a81: Status 404 returned error can't find the container with id d9e25e351fa9b6de2126dec7d50fd448a380f9de2be5151f832cc5b11ddc5a81 Apr 17 14:43:29.987087 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:29.986988 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-npww5\" (UniqueName: \"kubernetes.io/projected/243de439-5be8-46ad-a8ce-3b20ee27fdb6-kube-api-access-npww5\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1j4m4t\" (UID: \"243de439-5be8-46ad-a8ce-3b20ee27fdb6\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1j4m4t" Apr 17 14:43:29.987087 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:29.987075 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/243de439-5be8-46ad-a8ce-3b20ee27fdb6-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1j4m4t\" (UID: \"243de439-5be8-46ad-a8ce-3b20ee27fdb6\") " 
pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1j4m4t" Apr 17 14:43:29.987332 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:29.987109 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/243de439-5be8-46ad-a8ce-3b20ee27fdb6-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1j4m4t\" (UID: \"243de439-5be8-46ad-a8ce-3b20ee27fdb6\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1j4m4t" Apr 17 14:43:29.987506 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:29.987484 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/243de439-5be8-46ad-a8ce-3b20ee27fdb6-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1j4m4t\" (UID: \"243de439-5be8-46ad-a8ce-3b20ee27fdb6\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1j4m4t" Apr 17 14:43:29.987570 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:29.987554 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/243de439-5be8-46ad-a8ce-3b20ee27fdb6-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1j4m4t\" (UID: \"243de439-5be8-46ad-a8ce-3b20ee27fdb6\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1j4m4t" Apr 17 14:43:29.995649 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:29.995619 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-npww5\" (UniqueName: \"kubernetes.io/projected/243de439-5be8-46ad-a8ce-3b20ee27fdb6-kube-api-access-npww5\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1j4m4t\" (UID: \"243de439-5be8-46ad-a8ce-3b20ee27fdb6\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1j4m4t" Apr 17 14:43:30.162930 ip-10-0-143-171 
kubenswrapper[2582]: I0417 14:43:30.162892 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1j4m4t" Apr 17 14:43:30.289225 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:30.289201 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1j4m4t"] Apr 17 14:43:30.291345 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:43:30.291313 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod243de439_5be8_46ad_a8ce_3b20ee27fdb6.slice/crio-84829262f10fc04a00107ff460de740bf2337912a89eb66fad7cd99eab2e510c WatchSource:0}: Error finding container 84829262f10fc04a00107ff460de740bf2337912a89eb66fad7cd99eab2e510c: Status 404 returned error can't find the container with id 84829262f10fc04a00107ff460de740bf2337912a89eb66fad7cd99eab2e510c Apr 17 14:43:30.412839 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:30.412786 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0p8czd"] Apr 17 14:43:30.416321 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:30.416304 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0p8czd" Apr 17 14:43:30.422775 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:30.422745 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0p8czd"] Apr 17 14:43:30.490561 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:30.490466 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1768a071-915d-4020-99b0-ed685dda3a5c-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0p8czd\" (UID: \"1768a071-915d-4020-99b0-ed685dda3a5c\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0p8czd" Apr 17 14:43:30.490561 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:30.490523 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vrf6\" (UniqueName: \"kubernetes.io/projected/1768a071-915d-4020-99b0-ed685dda3a5c-kube-api-access-6vrf6\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0p8czd\" (UID: \"1768a071-915d-4020-99b0-ed685dda3a5c\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0p8czd" Apr 17 14:43:30.490754 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:30.490619 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1768a071-915d-4020-99b0-ed685dda3a5c-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0p8czd\" (UID: \"1768a071-915d-4020-99b0-ed685dda3a5c\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0p8czd" Apr 17 14:43:30.591738 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:30.591700 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" 
(UniqueName: \"kubernetes.io/empty-dir/1768a071-915d-4020-99b0-ed685dda3a5c-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0p8czd\" (UID: \"1768a071-915d-4020-99b0-ed685dda3a5c\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0p8czd" Apr 17 14:43:30.591914 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:30.591781 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1768a071-915d-4020-99b0-ed685dda3a5c-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0p8czd\" (UID: \"1768a071-915d-4020-99b0-ed685dda3a5c\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0p8czd" Apr 17 14:43:30.591914 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:30.591852 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6vrf6\" (UniqueName: \"kubernetes.io/projected/1768a071-915d-4020-99b0-ed685dda3a5c-kube-api-access-6vrf6\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0p8czd\" (UID: \"1768a071-915d-4020-99b0-ed685dda3a5c\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0p8czd" Apr 17 14:43:30.592155 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:30.592132 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1768a071-915d-4020-99b0-ed685dda3a5c-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0p8czd\" (UID: \"1768a071-915d-4020-99b0-ed685dda3a5c\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0p8czd" Apr 17 14:43:30.592219 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:30.592165 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1768a071-915d-4020-99b0-ed685dda3a5c-bundle\") pod 
\"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0p8czd\" (UID: \"1768a071-915d-4020-99b0-ed685dda3a5c\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0p8czd" Apr 17 14:43:30.599511 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:30.599485 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vrf6\" (UniqueName: \"kubernetes.io/projected/1768a071-915d-4020-99b0-ed685dda3a5c-kube-api-access-6vrf6\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0p8czd\" (UID: \"1768a071-915d-4020-99b0-ed685dda3a5c\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0p8czd" Apr 17 14:43:30.634869 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:30.634836 2582 generic.go:358] "Generic (PLEG): container finished" podID="243de439-5be8-46ad-a8ce-3b20ee27fdb6" containerID="d59fc6c96dfe8fdb2217efb23cef4bd686d63a98c1fc301d6ae537fbeb74ff7b" exitCode=0 Apr 17 14:43:30.635047 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:30.634908 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1j4m4t" event={"ID":"243de439-5be8-46ad-a8ce-3b20ee27fdb6","Type":"ContainerDied","Data":"d59fc6c96dfe8fdb2217efb23cef4bd686d63a98c1fc301d6ae537fbeb74ff7b"} Apr 17 14:43:30.635047 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:30.634930 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1j4m4t" event={"ID":"243de439-5be8-46ad-a8ce-3b20ee27fdb6","Type":"ContainerStarted","Data":"84829262f10fc04a00107ff460de740bf2337912a89eb66fad7cd99eab2e510c"} Apr 17 14:43:30.636346 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:30.636318 2582 generic.go:358] "Generic (PLEG): container finished" podID="897b8f66-fa5f-4f88-8b95-fbe02babf7ba" containerID="eb0da1d3256ad46e6f821e2ddcb197832eeda2021c257ce047b7d03bb2a4f3da" exitCode=0 Apr 17 
14:43:30.636472 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:30.636407 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759lwhk8" event={"ID":"897b8f66-fa5f-4f88-8b95-fbe02babf7ba","Type":"ContainerDied","Data":"eb0da1d3256ad46e6f821e2ddcb197832eeda2021c257ce047b7d03bb2a4f3da"} Apr 17 14:43:30.636534 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:30.636468 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759lwhk8" event={"ID":"897b8f66-fa5f-4f88-8b95-fbe02babf7ba","Type":"ContainerStarted","Data":"d9e25e351fa9b6de2126dec7d50fd448a380f9de2be5151f832cc5b11ddc5a81"} Apr 17 14:43:30.752983 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:30.752948 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0p8czd" Apr 17 14:43:30.884772 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:30.884742 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0p8czd"] Apr 17 14:43:30.886862 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:43:30.886834 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1768a071_915d_4020_99b0_ed685dda3a5c.slice/crio-a8be3c69fe0552cc6714f30c04c1318402c6cb2e57c74c9c2ecc1a73e989a090 WatchSource:0}: Error finding container a8be3c69fe0552cc6714f30c04c1318402c6cb2e57c74c9c2ecc1a73e989a090: Status 404 returned error can't find the container with id a8be3c69fe0552cc6714f30c04c1318402c6cb2e57c74c9c2ecc1a73e989a090 Apr 17 14:43:31.013525 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:31.013440 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed737dp4h"] Apr 17 
14:43:31.017067 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:31.017049 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed737dp4h" Apr 17 14:43:31.023034 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:31.023002 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed737dp4h"] Apr 17 14:43:31.095874 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:31.095828 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbn7d\" (UniqueName: \"kubernetes.io/projected/07774c38-aa20-409e-bfa2-e7a68c224bf6-kube-api-access-xbn7d\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed737dp4h\" (UID: \"07774c38-aa20-409e-bfa2-e7a68c224bf6\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed737dp4h" Apr 17 14:43:31.096031 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:31.095886 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/07774c38-aa20-409e-bfa2-e7a68c224bf6-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed737dp4h\" (UID: \"07774c38-aa20-409e-bfa2-e7a68c224bf6\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed737dp4h" Apr 17 14:43:31.096031 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:31.095909 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/07774c38-aa20-409e-bfa2-e7a68c224bf6-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed737dp4h\" (UID: \"07774c38-aa20-409e-bfa2-e7a68c224bf6\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed737dp4h" Apr 17 14:43:31.196478 ip-10-0-143-171 
kubenswrapper[2582]: I0417 14:43:31.196437 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xbn7d\" (UniqueName: \"kubernetes.io/projected/07774c38-aa20-409e-bfa2-e7a68c224bf6-kube-api-access-xbn7d\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed737dp4h\" (UID: \"07774c38-aa20-409e-bfa2-e7a68c224bf6\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed737dp4h" Apr 17 14:43:31.196633 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:31.196488 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/07774c38-aa20-409e-bfa2-e7a68c224bf6-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed737dp4h\" (UID: \"07774c38-aa20-409e-bfa2-e7a68c224bf6\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed737dp4h" Apr 17 14:43:31.196633 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:31.196505 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/07774c38-aa20-409e-bfa2-e7a68c224bf6-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed737dp4h\" (UID: \"07774c38-aa20-409e-bfa2-e7a68c224bf6\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed737dp4h" Apr 17 14:43:31.196918 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:31.196900 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/07774c38-aa20-409e-bfa2-e7a68c224bf6-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed737dp4h\" (UID: \"07774c38-aa20-409e-bfa2-e7a68c224bf6\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed737dp4h" Apr 17 14:43:31.196974 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:31.196941 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/07774c38-aa20-409e-bfa2-e7a68c224bf6-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed737dp4h\" (UID: \"07774c38-aa20-409e-bfa2-e7a68c224bf6\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed737dp4h" Apr 17 14:43:31.204537 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:31.204509 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbn7d\" (UniqueName: \"kubernetes.io/projected/07774c38-aa20-409e-bfa2-e7a68c224bf6-kube-api-access-xbn7d\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed737dp4h\" (UID: \"07774c38-aa20-409e-bfa2-e7a68c224bf6\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed737dp4h" Apr 17 14:43:31.345289 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:31.345207 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed737dp4h" Apr 17 14:43:31.506764 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:31.506734 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed737dp4h"] Apr 17 14:43:31.533049 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:43:31.533009 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07774c38_aa20_409e_bfa2_e7a68c224bf6.slice/crio-dafdd93fd2ac7408fe898a658f8d621bf8693fcdf355f479967244e32e1ee837 WatchSource:0}: Error finding container dafdd93fd2ac7408fe898a658f8d621bf8693fcdf355f479967244e32e1ee837: Status 404 returned error can't find the container with id dafdd93fd2ac7408fe898a658f8d621bf8693fcdf355f479967244e32e1ee837 Apr 17 14:43:31.642939 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:31.642900 2582 generic.go:358] "Generic (PLEG): container finished" 
podID="243de439-5be8-46ad-a8ce-3b20ee27fdb6" containerID="d0c670cc15d9025af3df470badd597808d6fa1fdd68411127ece1029d8679f86" exitCode=0 Apr 17 14:43:31.643124 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:31.642959 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1j4m4t" event={"ID":"243de439-5be8-46ad-a8ce-3b20ee27fdb6","Type":"ContainerDied","Data":"d0c670cc15d9025af3df470badd597808d6fa1fdd68411127ece1029d8679f86"} Apr 17 14:43:31.644730 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:31.644708 2582 generic.go:358] "Generic (PLEG): container finished" podID="1768a071-915d-4020-99b0-ed685dda3a5c" containerID="a79b19b3c49f2b9eaa40745ef2fb0a41c5bcabaf69e0dd0cb9ee6256cb1cea07" exitCode=0 Apr 17 14:43:31.644917 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:31.644887 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0p8czd" event={"ID":"1768a071-915d-4020-99b0-ed685dda3a5c","Type":"ContainerDied","Data":"a79b19b3c49f2b9eaa40745ef2fb0a41c5bcabaf69e0dd0cb9ee6256cb1cea07"} Apr 17 14:43:31.644917 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:31.644917 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0p8czd" event={"ID":"1768a071-915d-4020-99b0-ed685dda3a5c","Type":"ContainerStarted","Data":"a8be3c69fe0552cc6714f30c04c1318402c6cb2e57c74c9c2ecc1a73e989a090"} Apr 17 14:43:31.646704 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:31.646678 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed737dp4h" event={"ID":"07774c38-aa20-409e-bfa2-e7a68c224bf6","Type":"ContainerStarted","Data":"f17cd0d6c0400c94fbcd5dcb1b466c68b498d21fbb17fb8bc2277db178399064"} Apr 17 14:43:31.646839 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:31.646712 2582 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed737dp4h" event={"ID":"07774c38-aa20-409e-bfa2-e7a68c224bf6","Type":"ContainerStarted","Data":"dafdd93fd2ac7408fe898a658f8d621bf8693fcdf355f479967244e32e1ee837"} Apr 17 14:43:32.652401 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:32.652359 2582 generic.go:358] "Generic (PLEG): container finished" podID="07774c38-aa20-409e-bfa2-e7a68c224bf6" containerID="f17cd0d6c0400c94fbcd5dcb1b466c68b498d21fbb17fb8bc2277db178399064" exitCode=0 Apr 17 14:43:32.652862 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:32.652457 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed737dp4h" event={"ID":"07774c38-aa20-409e-bfa2-e7a68c224bf6","Type":"ContainerDied","Data":"f17cd0d6c0400c94fbcd5dcb1b466c68b498d21fbb17fb8bc2277db178399064"} Apr 17 14:43:32.654479 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:32.654456 2582 generic.go:358] "Generic (PLEG): container finished" podID="243de439-5be8-46ad-a8ce-3b20ee27fdb6" containerID="e997aa62c752878199ba9adb9856da801a34af2c3a49aa65192fb4fa56f23fb7" exitCode=0 Apr 17 14:43:32.654582 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:32.654523 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1j4m4t" event={"ID":"243de439-5be8-46ad-a8ce-3b20ee27fdb6","Type":"ContainerDied","Data":"e997aa62c752878199ba9adb9856da801a34af2c3a49aa65192fb4fa56f23fb7"} Apr 17 14:43:32.656229 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:32.656206 2582 generic.go:358] "Generic (PLEG): container finished" podID="897b8f66-fa5f-4f88-8b95-fbe02babf7ba" containerID="17c0c00b9de1c21bd5537369058e114fec467ce635647fcbd5a5355c26561f5e" exitCode=0 Apr 17 14:43:32.656351 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:32.656328 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759lwhk8" event={"ID":"897b8f66-fa5f-4f88-8b95-fbe02babf7ba","Type":"ContainerDied","Data":"17c0c00b9de1c21bd5537369058e114fec467ce635647fcbd5a5355c26561f5e"} Apr 17 14:43:32.657894 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:32.657875 2582 generic.go:358] "Generic (PLEG): container finished" podID="1768a071-915d-4020-99b0-ed685dda3a5c" containerID="dbd43318d7c35b6897a4f1d57014eb03a9174b9405e637ee0427e36f54ff63a6" exitCode=0 Apr 17 14:43:32.657963 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:32.657942 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0p8czd" event={"ID":"1768a071-915d-4020-99b0-ed685dda3a5c","Type":"ContainerDied","Data":"dbd43318d7c35b6897a4f1d57014eb03a9174b9405e637ee0427e36f54ff63a6"} Apr 17 14:43:33.664419 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:33.664337 2582 generic.go:358] "Generic (PLEG): container finished" podID="07774c38-aa20-409e-bfa2-e7a68c224bf6" containerID="3f1fdc2a196e9224018efe73632907d01acc4c28417cff0f4227369f0c5fbe08" exitCode=0 Apr 17 14:43:33.664797 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:33.664425 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed737dp4h" event={"ID":"07774c38-aa20-409e-bfa2-e7a68c224bf6","Type":"ContainerDied","Data":"3f1fdc2a196e9224018efe73632907d01acc4c28417cff0f4227369f0c5fbe08"} Apr 17 14:43:33.666370 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:33.666341 2582 generic.go:358] "Generic (PLEG): container finished" podID="897b8f66-fa5f-4f88-8b95-fbe02babf7ba" containerID="16e5de4fb77320ec5cebe2736dc711a1a49eb3d877b348bd173d1d2de11435e7" exitCode=0 Apr 17 14:43:33.666476 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:33.666378 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759lwhk8" event={"ID":"897b8f66-fa5f-4f88-8b95-fbe02babf7ba","Type":"ContainerDied","Data":"16e5de4fb77320ec5cebe2736dc711a1a49eb3d877b348bd173d1d2de11435e7"} Apr 17 14:43:33.674036 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:33.674009 2582 generic.go:358] "Generic (PLEG): container finished" podID="1768a071-915d-4020-99b0-ed685dda3a5c" containerID="90f47c2896aae31d8f1e7a30e48cea7cb1a36d74923116b81eb168fcf7331119" exitCode=0 Apr 17 14:43:33.674137 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:33.674100 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0p8czd" event={"ID":"1768a071-915d-4020-99b0-ed685dda3a5c","Type":"ContainerDied","Data":"90f47c2896aae31d8f1e7a30e48cea7cb1a36d74923116b81eb168fcf7331119"} Apr 17 14:43:33.808074 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:33.808045 2582 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1j4m4t" Apr 17 14:43:33.821482 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:33.821451 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/243de439-5be8-46ad-a8ce-3b20ee27fdb6-bundle\") pod \"243de439-5be8-46ad-a8ce-3b20ee27fdb6\" (UID: \"243de439-5be8-46ad-a8ce-3b20ee27fdb6\") " Apr 17 14:43:33.821604 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:33.821494 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/243de439-5be8-46ad-a8ce-3b20ee27fdb6-util\") pod \"243de439-5be8-46ad-a8ce-3b20ee27fdb6\" (UID: \"243de439-5be8-46ad-a8ce-3b20ee27fdb6\") " Apr 17 14:43:33.821604 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:33.821528 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-npww5\" (UniqueName: \"kubernetes.io/projected/243de439-5be8-46ad-a8ce-3b20ee27fdb6-kube-api-access-npww5\") pod \"243de439-5be8-46ad-a8ce-3b20ee27fdb6\" (UID: \"243de439-5be8-46ad-a8ce-3b20ee27fdb6\") " Apr 17 14:43:33.822057 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:33.822032 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/243de439-5be8-46ad-a8ce-3b20ee27fdb6-bundle" (OuterVolumeSpecName: "bundle") pod "243de439-5be8-46ad-a8ce-3b20ee27fdb6" (UID: "243de439-5be8-46ad-a8ce-3b20ee27fdb6"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 14:43:33.823772 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:33.823744 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/243de439-5be8-46ad-a8ce-3b20ee27fdb6-kube-api-access-npww5" (OuterVolumeSpecName: "kube-api-access-npww5") pod "243de439-5be8-46ad-a8ce-3b20ee27fdb6" (UID: "243de439-5be8-46ad-a8ce-3b20ee27fdb6"). InnerVolumeSpecName "kube-api-access-npww5". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 14:43:33.827080 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:33.827053 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/243de439-5be8-46ad-a8ce-3b20ee27fdb6-util" (OuterVolumeSpecName: "util") pod "243de439-5be8-46ad-a8ce-3b20ee27fdb6" (UID: "243de439-5be8-46ad-a8ce-3b20ee27fdb6"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 14:43:33.922235 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:33.922131 2582 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/243de439-5be8-46ad-a8ce-3b20ee27fdb6-bundle\") on node \"ip-10-0-143-171.ec2.internal\" DevicePath \"\"" Apr 17 14:43:33.922235 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:33.922169 2582 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/243de439-5be8-46ad-a8ce-3b20ee27fdb6-util\") on node \"ip-10-0-143-171.ec2.internal\" DevicePath \"\"" Apr 17 14:43:33.922235 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:33.922180 2582 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-npww5\" (UniqueName: \"kubernetes.io/projected/243de439-5be8-46ad-a8ce-3b20ee27fdb6-kube-api-access-npww5\") on node \"ip-10-0-143-171.ec2.internal\" DevicePath \"\"" Apr 17 14:43:34.614024 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:34.613984 2582 kubelet.go:2537] "SyncLoop 
ADD" source="api" pods=["openshift-console/console-7648f79794-9hjrz"] Apr 17 14:43:34.614549 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:34.614527 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="243de439-5be8-46ad-a8ce-3b20ee27fdb6" containerName="pull" Apr 17 14:43:34.614594 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:34.614554 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="243de439-5be8-46ad-a8ce-3b20ee27fdb6" containerName="pull" Apr 17 14:43:34.614594 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:34.614583 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="243de439-5be8-46ad-a8ce-3b20ee27fdb6" containerName="util" Apr 17 14:43:34.614594 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:34.614592 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="243de439-5be8-46ad-a8ce-3b20ee27fdb6" containerName="util" Apr 17 14:43:34.614687 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:34.614606 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="243de439-5be8-46ad-a8ce-3b20ee27fdb6" containerName="extract" Apr 17 14:43:34.614687 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:34.614615 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="243de439-5be8-46ad-a8ce-3b20ee27fdb6" containerName="extract" Apr 17 14:43:34.614758 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:34.614703 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="243de439-5be8-46ad-a8ce-3b20ee27fdb6" containerName="extract" Apr 17 14:43:34.617949 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:34.617929 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7648f79794-9hjrz" Apr 17 14:43:34.622416 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:34.622371 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 17 14:43:34.622416 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:34.622391 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-lw4t2\"" Apr 17 14:43:34.622645 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:34.622381 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 17 14:43:34.622645 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:34.622489 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 17 14:43:34.622645 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:34.622389 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 17 14:43:34.622645 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:34.622375 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 17 14:43:34.627180 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:34.627156 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 17 14:43:34.627949 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:34.627926 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7648f79794-9hjrz"] Apr 17 14:43:34.629235 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:34.629209 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/2a3e1b8e-c193-404a-8b22-503df0af5366-console-config\") pod \"console-7648f79794-9hjrz\" (UID: \"2a3e1b8e-c193-404a-8b22-503df0af5366\") " pod="openshift-console/console-7648f79794-9hjrz" Apr 17 14:43:34.629342 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:34.629280 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2a3e1b8e-c193-404a-8b22-503df0af5366-oauth-serving-cert\") pod \"console-7648f79794-9hjrz\" (UID: \"2a3e1b8e-c193-404a-8b22-503df0af5366\") " pod="openshift-console/console-7648f79794-9hjrz" Apr 17 14:43:34.629397 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:34.629359 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mj79m\" (UniqueName: \"kubernetes.io/projected/2a3e1b8e-c193-404a-8b22-503df0af5366-kube-api-access-mj79m\") pod \"console-7648f79794-9hjrz\" (UID: \"2a3e1b8e-c193-404a-8b22-503df0af5366\") " pod="openshift-console/console-7648f79794-9hjrz" Apr 17 14:43:34.629457 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:34.629444 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2a3e1b8e-c193-404a-8b22-503df0af5366-console-serving-cert\") pod \"console-7648f79794-9hjrz\" (UID: \"2a3e1b8e-c193-404a-8b22-503df0af5366\") " pod="openshift-console/console-7648f79794-9hjrz" Apr 17 14:43:34.629520 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:34.629472 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2a3e1b8e-c193-404a-8b22-503df0af5366-console-oauth-config\") pod \"console-7648f79794-9hjrz\" (UID: \"2a3e1b8e-c193-404a-8b22-503df0af5366\") " pod="openshift-console/console-7648f79794-9hjrz" Apr 17 14:43:34.629520 ip-10-0-143-171 
kubenswrapper[2582]: I0417 14:43:34.629500 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2a3e1b8e-c193-404a-8b22-503df0af5366-trusted-ca-bundle\") pod \"console-7648f79794-9hjrz\" (UID: \"2a3e1b8e-c193-404a-8b22-503df0af5366\") " pod="openshift-console/console-7648f79794-9hjrz" Apr 17 14:43:34.629625 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:34.629526 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2a3e1b8e-c193-404a-8b22-503df0af5366-service-ca\") pod \"console-7648f79794-9hjrz\" (UID: \"2a3e1b8e-c193-404a-8b22-503df0af5366\") " pod="openshift-console/console-7648f79794-9hjrz" Apr 17 14:43:34.681794 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:34.681378 2582 generic.go:358] "Generic (PLEG): container finished" podID="07774c38-aa20-409e-bfa2-e7a68c224bf6" containerID="f6c524cf7e955eb81c0daa66d173459e99713d31758ce60d539cedd1a6a5617a" exitCode=0 Apr 17 14:43:34.681794 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:34.681502 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed737dp4h" event={"ID":"07774c38-aa20-409e-bfa2-e7a68c224bf6","Type":"ContainerDied","Data":"f6c524cf7e955eb81c0daa66d173459e99713d31758ce60d539cedd1a6a5617a"} Apr 17 14:43:34.683956 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:34.683932 2582 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1j4m4t" Apr 17 14:43:34.684062 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:34.683955 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1j4m4t" event={"ID":"243de439-5be8-46ad-a8ce-3b20ee27fdb6","Type":"ContainerDied","Data":"84829262f10fc04a00107ff460de740bf2337912a89eb66fad7cd99eab2e510c"} Apr 17 14:43:34.684062 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:34.683989 2582 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="84829262f10fc04a00107ff460de740bf2337912a89eb66fad7cd99eab2e510c" Apr 17 14:43:34.730602 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:34.730563 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2a3e1b8e-c193-404a-8b22-503df0af5366-console-serving-cert\") pod \"console-7648f79794-9hjrz\" (UID: \"2a3e1b8e-c193-404a-8b22-503df0af5366\") " pod="openshift-console/console-7648f79794-9hjrz" Apr 17 14:43:34.730602 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:34.730601 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2a3e1b8e-c193-404a-8b22-503df0af5366-console-oauth-config\") pod \"console-7648f79794-9hjrz\" (UID: \"2a3e1b8e-c193-404a-8b22-503df0af5366\") " pod="openshift-console/console-7648f79794-9hjrz" Apr 17 14:43:34.730889 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:34.730620 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2a3e1b8e-c193-404a-8b22-503df0af5366-trusted-ca-bundle\") pod \"console-7648f79794-9hjrz\" (UID: \"2a3e1b8e-c193-404a-8b22-503df0af5366\") " pod="openshift-console/console-7648f79794-9hjrz" Apr 17 14:43:34.730889 
ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:34.730642 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2a3e1b8e-c193-404a-8b22-503df0af5366-service-ca\") pod \"console-7648f79794-9hjrz\" (UID: \"2a3e1b8e-c193-404a-8b22-503df0af5366\") " pod="openshift-console/console-7648f79794-9hjrz" Apr 17 14:43:34.730889 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:34.730702 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2a3e1b8e-c193-404a-8b22-503df0af5366-console-config\") pod \"console-7648f79794-9hjrz\" (UID: \"2a3e1b8e-c193-404a-8b22-503df0af5366\") " pod="openshift-console/console-7648f79794-9hjrz" Apr 17 14:43:34.730889 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:34.730725 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2a3e1b8e-c193-404a-8b22-503df0af5366-oauth-serving-cert\") pod \"console-7648f79794-9hjrz\" (UID: \"2a3e1b8e-c193-404a-8b22-503df0af5366\") " pod="openshift-console/console-7648f79794-9hjrz" Apr 17 14:43:34.730889 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:34.730773 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mj79m\" (UniqueName: \"kubernetes.io/projected/2a3e1b8e-c193-404a-8b22-503df0af5366-kube-api-access-mj79m\") pod \"console-7648f79794-9hjrz\" (UID: \"2a3e1b8e-c193-404a-8b22-503df0af5366\") " pod="openshift-console/console-7648f79794-9hjrz" Apr 17 14:43:34.732204 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:34.732137 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2a3e1b8e-c193-404a-8b22-503df0af5366-trusted-ca-bundle\") pod \"console-7648f79794-9hjrz\" (UID: \"2a3e1b8e-c193-404a-8b22-503df0af5366\") " 
pod="openshift-console/console-7648f79794-9hjrz" Apr 17 14:43:34.732538 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:34.732421 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2a3e1b8e-c193-404a-8b22-503df0af5366-oauth-serving-cert\") pod \"console-7648f79794-9hjrz\" (UID: \"2a3e1b8e-c193-404a-8b22-503df0af5366\") " pod="openshift-console/console-7648f79794-9hjrz" Apr 17 14:43:34.732538 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:34.732492 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2a3e1b8e-c193-404a-8b22-503df0af5366-console-config\") pod \"console-7648f79794-9hjrz\" (UID: \"2a3e1b8e-c193-404a-8b22-503df0af5366\") " pod="openshift-console/console-7648f79794-9hjrz" Apr 17 14:43:34.732538 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:34.732511 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2a3e1b8e-c193-404a-8b22-503df0af5366-service-ca\") pod \"console-7648f79794-9hjrz\" (UID: \"2a3e1b8e-c193-404a-8b22-503df0af5366\") " pod="openshift-console/console-7648f79794-9hjrz" Apr 17 14:43:34.734149 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:34.734086 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2a3e1b8e-c193-404a-8b22-503df0af5366-console-serving-cert\") pod \"console-7648f79794-9hjrz\" (UID: \"2a3e1b8e-c193-404a-8b22-503df0af5366\") " pod="openshift-console/console-7648f79794-9hjrz" Apr 17 14:43:34.734247 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:34.734202 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2a3e1b8e-c193-404a-8b22-503df0af5366-console-oauth-config\") pod \"console-7648f79794-9hjrz\" (UID: 
\"2a3e1b8e-c193-404a-8b22-503df0af5366\") " pod="openshift-console/console-7648f79794-9hjrz" Apr 17 14:43:34.739137 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:34.739110 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mj79m\" (UniqueName: \"kubernetes.io/projected/2a3e1b8e-c193-404a-8b22-503df0af5366-kube-api-access-mj79m\") pod \"console-7648f79794-9hjrz\" (UID: \"2a3e1b8e-c193-404a-8b22-503df0af5366\") " pod="openshift-console/console-7648f79794-9hjrz" Apr 17 14:43:34.829424 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:34.829400 2582 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0p8czd" Apr 17 14:43:34.851501 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:34.851473 2582 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759lwhk8" Apr 17 14:43:34.930011 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:34.929919 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7648f79794-9hjrz" Apr 17 14:43:34.932780 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:34.932756 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1768a071-915d-4020-99b0-ed685dda3a5c-util\") pod \"1768a071-915d-4020-99b0-ed685dda3a5c\" (UID: \"1768a071-915d-4020-99b0-ed685dda3a5c\") " Apr 17 14:43:34.932893 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:34.932847 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/897b8f66-fa5f-4f88-8b95-fbe02babf7ba-util\") pod \"897b8f66-fa5f-4f88-8b95-fbe02babf7ba\" (UID: \"897b8f66-fa5f-4f88-8b95-fbe02babf7ba\") " Apr 17 14:43:34.932893 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:34.932886 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/897b8f66-fa5f-4f88-8b95-fbe02babf7ba-bundle\") pod \"897b8f66-fa5f-4f88-8b95-fbe02babf7ba\" (UID: \"897b8f66-fa5f-4f88-8b95-fbe02babf7ba\") " Apr 17 14:43:34.932970 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:34.932903 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qrltb\" (UniqueName: \"kubernetes.io/projected/897b8f66-fa5f-4f88-8b95-fbe02babf7ba-kube-api-access-qrltb\") pod \"897b8f66-fa5f-4f88-8b95-fbe02babf7ba\" (UID: \"897b8f66-fa5f-4f88-8b95-fbe02babf7ba\") " Apr 17 14:43:34.932970 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:34.932919 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6vrf6\" (UniqueName: \"kubernetes.io/projected/1768a071-915d-4020-99b0-ed685dda3a5c-kube-api-access-6vrf6\") pod \"1768a071-915d-4020-99b0-ed685dda3a5c\" (UID: \"1768a071-915d-4020-99b0-ed685dda3a5c\") " Apr 17 14:43:34.932970 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:34.932939 2582 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1768a071-915d-4020-99b0-ed685dda3a5c-bundle\") pod \"1768a071-915d-4020-99b0-ed685dda3a5c\" (UID: \"1768a071-915d-4020-99b0-ed685dda3a5c\") " Apr 17 14:43:34.933490 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:34.933454 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/897b8f66-fa5f-4f88-8b95-fbe02babf7ba-bundle" (OuterVolumeSpecName: "bundle") pod "897b8f66-fa5f-4f88-8b95-fbe02babf7ba" (UID: "897b8f66-fa5f-4f88-8b95-fbe02babf7ba"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 14:43:34.933771 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:34.933569 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1768a071-915d-4020-99b0-ed685dda3a5c-bundle" (OuterVolumeSpecName: "bundle") pod "1768a071-915d-4020-99b0-ed685dda3a5c" (UID: "1768a071-915d-4020-99b0-ed685dda3a5c"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 14:43:34.935781 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:34.935753 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1768a071-915d-4020-99b0-ed685dda3a5c-kube-api-access-6vrf6" (OuterVolumeSpecName: "kube-api-access-6vrf6") pod "1768a071-915d-4020-99b0-ed685dda3a5c" (UID: "1768a071-915d-4020-99b0-ed685dda3a5c"). InnerVolumeSpecName "kube-api-access-6vrf6". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 14:43:34.935781 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:34.935760 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/897b8f66-fa5f-4f88-8b95-fbe02babf7ba-kube-api-access-qrltb" (OuterVolumeSpecName: "kube-api-access-qrltb") pod "897b8f66-fa5f-4f88-8b95-fbe02babf7ba" (UID: "897b8f66-fa5f-4f88-8b95-fbe02babf7ba"). InnerVolumeSpecName "kube-api-access-qrltb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 14:43:34.938724 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:34.938700 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1768a071-915d-4020-99b0-ed685dda3a5c-util" (OuterVolumeSpecName: "util") pod "1768a071-915d-4020-99b0-ed685dda3a5c" (UID: "1768a071-915d-4020-99b0-ed685dda3a5c"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 14:43:34.940864 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:34.940835 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/897b8f66-fa5f-4f88-8b95-fbe02babf7ba-util" (OuterVolumeSpecName: "util") pod "897b8f66-fa5f-4f88-8b95-fbe02babf7ba" (UID: "897b8f66-fa5f-4f88-8b95-fbe02babf7ba"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 14:43:35.033761 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:35.033716 2582 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qrltb\" (UniqueName: \"kubernetes.io/projected/897b8f66-fa5f-4f88-8b95-fbe02babf7ba-kube-api-access-qrltb\") on node \"ip-10-0-143-171.ec2.internal\" DevicePath \"\"" Apr 17 14:43:35.033761 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:35.033761 2582 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6vrf6\" (UniqueName: \"kubernetes.io/projected/1768a071-915d-4020-99b0-ed685dda3a5c-kube-api-access-6vrf6\") on node \"ip-10-0-143-171.ec2.internal\" DevicePath \"\"" Apr 17 14:43:35.033954 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:35.033781 2582 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1768a071-915d-4020-99b0-ed685dda3a5c-bundle\") on node \"ip-10-0-143-171.ec2.internal\" DevicePath \"\"" Apr 17 14:43:35.033954 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:35.033797 2582 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1768a071-915d-4020-99b0-ed685dda3a5c-util\") on node \"ip-10-0-143-171.ec2.internal\" DevicePath \"\"" Apr 17 14:43:35.033954 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:35.033840 2582 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/897b8f66-fa5f-4f88-8b95-fbe02babf7ba-util\") on node \"ip-10-0-143-171.ec2.internal\" DevicePath \"\"" Apr 17 14:43:35.033954 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:35.033856 2582 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/897b8f66-fa5f-4f88-8b95-fbe02babf7ba-bundle\") on node \"ip-10-0-143-171.ec2.internal\" DevicePath \"\"" Apr 17 14:43:35.061048 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:35.061017 2582 kubelet.go:2544] "SyncLoop 
UPDATE" source="api" pods=["openshift-console/console-7648f79794-9hjrz"] Apr 17 14:43:35.062515 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:43:35.062477 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a3e1b8e_c193_404a_8b22_503df0af5366.slice/crio-6f4e7277efc76191d6b4e26d08e8787679562bd42e991bc82491e4005d4e3a01 WatchSource:0}: Error finding container 6f4e7277efc76191d6b4e26d08e8787679562bd42e991bc82491e4005d4e3a01: Status 404 returned error can't find the container with id 6f4e7277efc76191d6b4e26d08e8787679562bd42e991bc82491e4005d4e3a01 Apr 17 14:43:35.689901 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:35.689871 2582 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0p8czd" Apr 17 14:43:35.690338 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:35.689867 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0p8czd" event={"ID":"1768a071-915d-4020-99b0-ed685dda3a5c","Type":"ContainerDied","Data":"a8be3c69fe0552cc6714f30c04c1318402c6cb2e57c74c9c2ecc1a73e989a090"} Apr 17 14:43:35.690338 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:35.690004 2582 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a8be3c69fe0552cc6714f30c04c1318402c6cb2e57c74c9c2ecc1a73e989a090" Apr 17 14:43:35.691611 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:35.691589 2582 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759lwhk8" Apr 17 14:43:35.691750 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:35.691586 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759lwhk8" event={"ID":"897b8f66-fa5f-4f88-8b95-fbe02babf7ba","Type":"ContainerDied","Data":"d9e25e351fa9b6de2126dec7d50fd448a380f9de2be5151f832cc5b11ddc5a81"} Apr 17 14:43:35.691750 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:35.691699 2582 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d9e25e351fa9b6de2126dec7d50fd448a380f9de2be5151f832cc5b11ddc5a81" Apr 17 14:43:35.692956 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:35.692922 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7648f79794-9hjrz" event={"ID":"2a3e1b8e-c193-404a-8b22-503df0af5366","Type":"ContainerStarted","Data":"b7bbb60034d35f4e2eec5c979e1efd4db6d4d00cf8b4cbb8853df02b82d53c17"} Apr 17 14:43:35.692956 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:35.692955 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7648f79794-9hjrz" event={"ID":"2a3e1b8e-c193-404a-8b22-503df0af5366","Type":"ContainerStarted","Data":"6f4e7277efc76191d6b4e26d08e8787679562bd42e991bc82491e4005d4e3a01"} Apr 17 14:43:35.709884 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:35.709831 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7648f79794-9hjrz" podStartSLOduration=1.709789783 podStartE2EDuration="1.709789783s" podCreationTimestamp="2026-04-17 14:43:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 14:43:35.708917786 +0000 UTC m=+568.576303436" watchObservedRunningTime="2026-04-17 14:43:35.709789783 +0000 UTC m=+568.577175433" Apr 17 
14:43:35.819317 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:35.819291 2582 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed737dp4h" Apr 17 14:43:35.841284 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:35.841249 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xbn7d\" (UniqueName: \"kubernetes.io/projected/07774c38-aa20-409e-bfa2-e7a68c224bf6-kube-api-access-xbn7d\") pod \"07774c38-aa20-409e-bfa2-e7a68c224bf6\" (UID: \"07774c38-aa20-409e-bfa2-e7a68c224bf6\") " Apr 17 14:43:35.841461 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:35.841323 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/07774c38-aa20-409e-bfa2-e7a68c224bf6-bundle\") pod \"07774c38-aa20-409e-bfa2-e7a68c224bf6\" (UID: \"07774c38-aa20-409e-bfa2-e7a68c224bf6\") " Apr 17 14:43:35.841461 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:35.841419 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/07774c38-aa20-409e-bfa2-e7a68c224bf6-util\") pod \"07774c38-aa20-409e-bfa2-e7a68c224bf6\" (UID: \"07774c38-aa20-409e-bfa2-e7a68c224bf6\") " Apr 17 14:43:35.842109 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:35.842076 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07774c38-aa20-409e-bfa2-e7a68c224bf6-bundle" (OuterVolumeSpecName: "bundle") pod "07774c38-aa20-409e-bfa2-e7a68c224bf6" (UID: "07774c38-aa20-409e-bfa2-e7a68c224bf6"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 14:43:35.843575 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:35.843546 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07774c38-aa20-409e-bfa2-e7a68c224bf6-kube-api-access-xbn7d" (OuterVolumeSpecName: "kube-api-access-xbn7d") pod "07774c38-aa20-409e-bfa2-e7a68c224bf6" (UID: "07774c38-aa20-409e-bfa2-e7a68c224bf6"). InnerVolumeSpecName "kube-api-access-xbn7d". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 14:43:35.848981 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:35.848949 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07774c38-aa20-409e-bfa2-e7a68c224bf6-util" (OuterVolumeSpecName: "util") pod "07774c38-aa20-409e-bfa2-e7a68c224bf6" (UID: "07774c38-aa20-409e-bfa2-e7a68c224bf6"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 14:43:35.942692 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:35.942606 2582 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xbn7d\" (UniqueName: \"kubernetes.io/projected/07774c38-aa20-409e-bfa2-e7a68c224bf6-kube-api-access-xbn7d\") on node \"ip-10-0-143-171.ec2.internal\" DevicePath \"\"" Apr 17 14:43:35.942692 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:35.942637 2582 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/07774c38-aa20-409e-bfa2-e7a68c224bf6-bundle\") on node \"ip-10-0-143-171.ec2.internal\" DevicePath \"\"" Apr 17 14:43:35.942692 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:35.942649 2582 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/07774c38-aa20-409e-bfa2-e7a68c224bf6-util\") on node \"ip-10-0-143-171.ec2.internal\" DevicePath \"\"" Apr 17 14:43:36.697857 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:36.697788 2582 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed737dp4h" event={"ID":"07774c38-aa20-409e-bfa2-e7a68c224bf6","Type":"ContainerDied","Data":"dafdd93fd2ac7408fe898a658f8d621bf8693fcdf355f479967244e32e1ee837"} Apr 17 14:43:36.697857 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:36.697854 2582 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dafdd93fd2ac7408fe898a658f8d621bf8693fcdf355f479967244e32e1ee837" Apr 17 14:43:36.698244 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:36.698000 2582 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed737dp4h" Apr 17 14:43:44.930543 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:44.930492 2582 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7648f79794-9hjrz" Apr 17 14:43:44.930543 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:44.930548 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7648f79794-9hjrz" Apr 17 14:43:44.935622 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:44.935594 2582 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7648f79794-9hjrz" Apr 17 14:43:45.742556 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:45.742523 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7648f79794-9hjrz" Apr 17 14:43:49.214603 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:49.214562 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-flbwn"] Apr 17 14:43:49.215001 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:49.214955 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="897b8f66-fa5f-4f88-8b95-fbe02babf7ba" 
containerName="util" Apr 17 14:43:49.215001 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:49.214966 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="897b8f66-fa5f-4f88-8b95-fbe02babf7ba" containerName="util" Apr 17 14:43:49.215001 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:49.214975 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="897b8f66-fa5f-4f88-8b95-fbe02babf7ba" containerName="pull" Apr 17 14:43:49.215001 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:49.214981 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="897b8f66-fa5f-4f88-8b95-fbe02babf7ba" containerName="pull" Apr 17 14:43:49.215001 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:49.214993 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1768a071-915d-4020-99b0-ed685dda3a5c" containerName="extract" Apr 17 14:43:49.215001 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:49.215000 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="1768a071-915d-4020-99b0-ed685dda3a5c" containerName="extract" Apr 17 14:43:49.215184 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:49.215009 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="897b8f66-fa5f-4f88-8b95-fbe02babf7ba" containerName="extract" Apr 17 14:43:49.215184 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:49.215014 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="897b8f66-fa5f-4f88-8b95-fbe02babf7ba" containerName="extract" Apr 17 14:43:49.215184 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:49.215021 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1768a071-915d-4020-99b0-ed685dda3a5c" containerName="util" Apr 17 14:43:49.215184 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:49.215027 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="1768a071-915d-4020-99b0-ed685dda3a5c" containerName="util" Apr 17 14:43:49.215184 ip-10-0-143-171 
kubenswrapper[2582]: I0417 14:43:49.215032 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1768a071-915d-4020-99b0-ed685dda3a5c" containerName="pull" Apr 17 14:43:49.215184 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:49.215037 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="1768a071-915d-4020-99b0-ed685dda3a5c" containerName="pull" Apr 17 14:43:49.215184 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:49.215047 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="07774c38-aa20-409e-bfa2-e7a68c224bf6" containerName="util" Apr 17 14:43:49.215184 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:49.215051 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="07774c38-aa20-409e-bfa2-e7a68c224bf6" containerName="util" Apr 17 14:43:49.215184 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:49.215059 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="07774c38-aa20-409e-bfa2-e7a68c224bf6" containerName="pull" Apr 17 14:43:49.215184 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:49.215065 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="07774c38-aa20-409e-bfa2-e7a68c224bf6" containerName="pull" Apr 17 14:43:49.215184 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:49.215070 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="07774c38-aa20-409e-bfa2-e7a68c224bf6" containerName="extract" Apr 17 14:43:49.215184 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:49.215074 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="07774c38-aa20-409e-bfa2-e7a68c224bf6" containerName="extract" Apr 17 14:43:49.215184 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:49.215141 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="1768a071-915d-4020-99b0-ed685dda3a5c" containerName="extract" Apr 17 14:43:49.215184 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:49.215151 2582 memory_manager.go:356] 
"RemoveStaleState removing state" podUID="897b8f66-fa5f-4f88-8b95-fbe02babf7ba" containerName="extract" Apr 17 14:43:49.215184 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:49.215158 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="07774c38-aa20-409e-bfa2-e7a68c224bf6" containerName="extract" Apr 17 14:43:49.219053 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:49.219034 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-flbwn" Apr 17 14:43:49.222023 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:49.222002 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"limitador-operator-controller-manager-dockercfg-p8kzh\"" Apr 17 14:43:49.228551 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:49.228522 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-flbwn"] Apr 17 14:43:49.256455 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:49.256421 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmx47\" (UniqueName: \"kubernetes.io/projected/986c86fc-f849-46e0-9e9a-57e2d40d8e5b-kube-api-access-cmx47\") pod \"limitador-operator-controller-manager-85c4996f8c-flbwn\" (UID: \"986c86fc-f849-46e0-9e9a-57e2d40d8e5b\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-flbwn" Apr 17 14:43:49.357737 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:49.357699 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cmx47\" (UniqueName: \"kubernetes.io/projected/986c86fc-f849-46e0-9e9a-57e2d40d8e5b-kube-api-access-cmx47\") pod \"limitador-operator-controller-manager-85c4996f8c-flbwn\" (UID: \"986c86fc-f849-46e0-9e9a-57e2d40d8e5b\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-flbwn" Apr 17 
14:43:49.377746 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:49.377703 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmx47\" (UniqueName: \"kubernetes.io/projected/986c86fc-f849-46e0-9e9a-57e2d40d8e5b-kube-api-access-cmx47\") pod \"limitador-operator-controller-manager-85c4996f8c-flbwn\" (UID: \"986c86fc-f849-46e0-9e9a-57e2d40d8e5b\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-flbwn" Apr 17 14:43:49.531178 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:49.531141 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-flbwn" Apr 17 14:43:49.663956 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:49.663860 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-flbwn"] Apr 17 14:43:49.666851 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:43:49.666819 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod986c86fc_f849_46e0_9e9a_57e2d40d8e5b.slice/crio-194f2eeb93e0327c3d3ebfb16acf2b69281f0e60e589da96b8b05c273219120b WatchSource:0}: Error finding container 194f2eeb93e0327c3d3ebfb16acf2b69281f0e60e589da96b8b05c273219120b: Status 404 returned error can't find the container with id 194f2eeb93e0327c3d3ebfb16acf2b69281f0e60e589da96b8b05c273219120b Apr 17 14:43:49.750558 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:49.750521 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-flbwn" event={"ID":"986c86fc-f849-46e0-9e9a-57e2d40d8e5b","Type":"ContainerStarted","Data":"194f2eeb93e0327c3d3ebfb16acf2b69281f0e60e589da96b8b05c273219120b"} Apr 17 14:43:51.759694 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:51.759658 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-flbwn" event={"ID":"986c86fc-f849-46e0-9e9a-57e2d40d8e5b","Type":"ContainerStarted","Data":"3c5de9f8088db978e95308288c0bef13918a2df1a986620035fc2a169c1e3c55"}
Apr 17 14:43:51.760211 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:51.759843 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-flbwn"
Apr 17 14:43:51.776400 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:51.776343 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-flbwn" podStartSLOduration=1.004947642 podStartE2EDuration="2.776325699s" podCreationTimestamp="2026-04-17 14:43:49 +0000 UTC" firstStartedPulling="2026-04-17 14:43:49.668816803 +0000 UTC m=+582.536202435" lastFinishedPulling="2026-04-17 14:43:51.44019486 +0000 UTC m=+584.307580492" observedRunningTime="2026-04-17 14:43:51.774674716 +0000 UTC m=+584.642060366" watchObservedRunningTime="2026-04-17 14:43:51.776325699 +0000 UTC m=+584.643711350"
Apr 17 14:43:56.815296 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:56.815258 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-7kw9m"]
Apr 17 14:43:56.818596 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:56.818579 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-7kw9m"
Apr 17 14:43:56.820970 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:56.820944 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-env\""
Apr 17 14:43:56.821093 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:56.821008 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-manager-dockercfg-qw8rk\""
Apr 17 14:43:56.831479 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:56.831454 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-7kw9m"]
Apr 17 14:43:56.931704 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:56.931666 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxbnk\" (UniqueName: \"kubernetes.io/projected/43e321c6-e135-4b2b-a15d-87bc335ca9ad-kube-api-access-fxbnk\") pod \"dns-operator-controller-manager-648d5c98bc-7kw9m\" (UID: \"43e321c6-e135-4b2b-a15d-87bc335ca9ad\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-7kw9m"
Apr 17 14:43:57.033020 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:57.032981 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fxbnk\" (UniqueName: \"kubernetes.io/projected/43e321c6-e135-4b2b-a15d-87bc335ca9ad-kube-api-access-fxbnk\") pod \"dns-operator-controller-manager-648d5c98bc-7kw9m\" (UID: \"43e321c6-e135-4b2b-a15d-87bc335ca9ad\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-7kw9m"
Apr 17 14:43:57.041566 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:57.041532 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxbnk\" (UniqueName: \"kubernetes.io/projected/43e321c6-e135-4b2b-a15d-87bc335ca9ad-kube-api-access-fxbnk\") pod \"dns-operator-controller-manager-648d5c98bc-7kw9m\" (UID: \"43e321c6-e135-4b2b-a15d-87bc335ca9ad\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-7kw9m"
Apr 17 14:43:57.129000 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:57.128908 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-7kw9m"
Apr 17 14:43:57.255719 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:57.255692 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-7kw9m"]
Apr 17 14:43:57.257612 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:43:57.257581 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43e321c6_e135_4b2b_a15d_87bc335ca9ad.slice/crio-7a6c91d2506a57651f8f61acb9427d9008a4a2151a20dde791c55e5586618c7f WatchSource:0}: Error finding container 7a6c91d2506a57651f8f61acb9427d9008a4a2151a20dde791c55e5586618c7f: Status 404 returned error can't find the container with id 7a6c91d2506a57651f8f61acb9427d9008a4a2151a20dde791c55e5586618c7f
Apr 17 14:43:57.783132 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:57.783089 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-7kw9m" event={"ID":"43e321c6-e135-4b2b-a15d-87bc335ca9ad","Type":"ContainerStarted","Data":"7a6c91d2506a57651f8f61acb9427d9008a4a2151a20dde791c55e5586618c7f"}
Apr 17 14:43:59.793383 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:59.793338 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-7kw9m" event={"ID":"43e321c6-e135-4b2b-a15d-87bc335ca9ad","Type":"ContainerStarted","Data":"a8cef1b534c8c51525ebb469bd5010f8867b9cbc3293ca8959c0390f2222a488"}
Apr 17 14:43:59.793748 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:59.793512 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-7kw9m"
Apr 17 14:43:59.813167 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:43:59.813093 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-7kw9m" podStartSLOduration=1.487943269 podStartE2EDuration="3.813072217s" podCreationTimestamp="2026-04-17 14:43:56 +0000 UTC" firstStartedPulling="2026-04-17 14:43:57.259615156 +0000 UTC m=+590.127000786" lastFinishedPulling="2026-04-17 14:43:59.584744104 +0000 UTC m=+592.452129734" observedRunningTime="2026-04-17 14:43:59.807625525 +0000 UTC m=+592.675011177" watchObservedRunningTime="2026-04-17 14:43:59.813072217 +0000 UTC m=+592.680457869"
Apr 17 14:44:02.765782 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:44:02.765744 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-flbwn"
Apr 17 14:44:07.653051 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:44:07.653018 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-bss6z_8cf1a0e0-95fc-4bce-84ed-e3222e5beeb2/console-operator/2.log"
Apr 17 14:44:07.653051 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:44:07.653047 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-bss6z_8cf1a0e0-95fc-4bce-84ed-e3222e5beeb2/console-operator/2.log"
Apr 17 14:44:10.799715 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:44:10.799675 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-7kw9m"
Apr 17 14:44:12.465760 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:44:12.465721 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-flbwn"]
Apr 17 14:44:12.466205 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:44:12.465960 2582 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-flbwn" podUID="986c86fc-f849-46e0-9e9a-57e2d40d8e5b" containerName="manager" containerID="cri-o://3c5de9f8088db978e95308288c0bef13918a2df1a986620035fc2a169c1e3c55" gracePeriod=2
Apr 17 14:44:12.485740 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:44:12.485706 2582 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-flbwn"]
Apr 17 14:44:12.496932 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:44:12.496896 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xk7f2"]
Apr 17 14:44:12.497259 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:44:12.497247 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="986c86fc-f849-46e0-9e9a-57e2d40d8e5b" containerName="manager"
Apr 17 14:44:12.497330 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:44:12.497260 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="986c86fc-f849-46e0-9e9a-57e2d40d8e5b" containerName="manager"
Apr 17 14:44:12.497330 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:44:12.497328 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="986c86fc-f849-46e0-9e9a-57e2d40d8e5b" containerName="manager"
Apr 17 14:44:12.500490 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:44:12.500470 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xk7f2"
Apr 17 14:44:12.512172 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:44:12.512142 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xk7f2"]
Apr 17 14:44:12.554240 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:44:12.554205 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-bzjv5"]
Apr 17 14:44:12.557683 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:44:12.557661 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-bzjv5"
Apr 17 14:44:12.560363 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:44:12.560334 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-nll7t\""
Apr 17 14:44:12.568029 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:44:12.568002 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-bzjv5"]
Apr 17 14:44:12.676429 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:44:12.676387 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/3106d538-3643-42c5-92cb-dbea4af3a682-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-bzjv5\" (UID: \"3106d538-3643-42c5-92cb-dbea4af3a682\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-bzjv5"
Apr 17 14:44:12.676625 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:44:12.676444 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9blnf\" (UniqueName: \"kubernetes.io/projected/3106d538-3643-42c5-92cb-dbea4af3a682-kube-api-access-9blnf\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-bzjv5\" (UID: \"3106d538-3643-42c5-92cb-dbea4af3a682\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-bzjv5"
Apr 17 14:44:12.676625 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:44:12.676481 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzmzx\" (UniqueName: \"kubernetes.io/projected/84c2a07d-f1ec-4180-ac4b-baf6c4323db5-kube-api-access-zzmzx\") pod \"limitador-operator-controller-manager-85c4996f8c-xk7f2\" (UID: \"84c2a07d-f1ec-4180-ac4b-baf6c4323db5\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xk7f2"
Apr 17 14:44:12.700491 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:44:12.700467 2582 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-flbwn"
Apr 17 14:44:12.712231 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:44:12.712196 2582 status_manager.go:895] "Failed to get status for pod" podUID="986c86fc-f849-46e0-9e9a-57e2d40d8e5b" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-flbwn" err="pods \"limitador-operator-controller-manager-85c4996f8c-flbwn\" is forbidden: User \"system:node:ip-10-0-143-171.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-143-171.ec2.internal' and this object"
Apr 17 14:44:12.777227 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:44:12.777191 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/3106d538-3643-42c5-92cb-dbea4af3a682-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-bzjv5\" (UID: \"3106d538-3643-42c5-92cb-dbea4af3a682\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-bzjv5"
Apr 17 14:44:12.777373 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:44:12.777239 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9blnf\" (UniqueName: \"kubernetes.io/projected/3106d538-3643-42c5-92cb-dbea4af3a682-kube-api-access-9blnf\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-bzjv5\" (UID: \"3106d538-3643-42c5-92cb-dbea4af3a682\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-bzjv5"
Apr 17 14:44:12.777373 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:44:12.777272 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zzmzx\" (UniqueName: \"kubernetes.io/projected/84c2a07d-f1ec-4180-ac4b-baf6c4323db5-kube-api-access-zzmzx\") pod \"limitador-operator-controller-manager-85c4996f8c-xk7f2\" (UID: \"84c2a07d-f1ec-4180-ac4b-baf6c4323db5\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xk7f2"
Apr 17 14:44:12.777605 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:44:12.777584 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/3106d538-3643-42c5-92cb-dbea4af3a682-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-bzjv5\" (UID: \"3106d538-3643-42c5-92cb-dbea4af3a682\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-bzjv5"
Apr 17 14:44:12.792285 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:44:12.792250 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzmzx\" (UniqueName: \"kubernetes.io/projected/84c2a07d-f1ec-4180-ac4b-baf6c4323db5-kube-api-access-zzmzx\") pod \"limitador-operator-controller-manager-85c4996f8c-xk7f2\" (UID: \"84c2a07d-f1ec-4180-ac4b-baf6c4323db5\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xk7f2"
Apr 17 14:44:12.795342 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:44:12.795315 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9blnf\" (UniqueName: \"kubernetes.io/projected/3106d538-3643-42c5-92cb-dbea4af3a682-kube-api-access-9blnf\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-bzjv5\" (UID: \"3106d538-3643-42c5-92cb-dbea4af3a682\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-bzjv5"
Apr 17 14:44:12.842287 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:44:12.842249 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xk7f2"
Apr 17 14:44:12.848953 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:44:12.848919 2582 generic.go:358] "Generic (PLEG): container finished" podID="986c86fc-f849-46e0-9e9a-57e2d40d8e5b" containerID="3c5de9f8088db978e95308288c0bef13918a2df1a986620035fc2a169c1e3c55" exitCode=0
Apr 17 14:44:12.849109 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:44:12.849053 2582 scope.go:117] "RemoveContainer" containerID="3c5de9f8088db978e95308288c0bef13918a2df1a986620035fc2a169c1e3c55"
Apr 17 14:44:12.849172 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:44:12.849050 2582 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-flbwn"
Apr 17 14:44:12.851671 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:44:12.851636 2582 status_manager.go:895] "Failed to get status for pod" podUID="986c86fc-f849-46e0-9e9a-57e2d40d8e5b" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-flbwn" err="pods \"limitador-operator-controller-manager-85c4996f8c-flbwn\" is forbidden: User \"system:node:ip-10-0-143-171.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-143-171.ec2.internal' and this object"
Apr 17 14:44:12.858989 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:44:12.858962 2582 scope.go:117] "RemoveContainer" containerID="3c5de9f8088db978e95308288c0bef13918a2df1a986620035fc2a169c1e3c55"
Apr 17 14:44:12.859287 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:44:12.859267 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c5de9f8088db978e95308288c0bef13918a2df1a986620035fc2a169c1e3c55\": container with ID starting with 3c5de9f8088db978e95308288c0bef13918a2df1a986620035fc2a169c1e3c55 not found: ID does not exist" containerID="3c5de9f8088db978e95308288c0bef13918a2df1a986620035fc2a169c1e3c55"
Apr 17 14:44:12.859359 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:44:12.859309 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c5de9f8088db978e95308288c0bef13918a2df1a986620035fc2a169c1e3c55"} err="failed to get container status \"3c5de9f8088db978e95308288c0bef13918a2df1a986620035fc2a169c1e3c55\": rpc error: code = NotFound desc = could not find container \"3c5de9f8088db978e95308288c0bef13918a2df1a986620035fc2a169c1e3c55\": container with ID starting with 3c5de9f8088db978e95308288c0bef13918a2df1a986620035fc2a169c1e3c55 not found: ID does not exist"
Apr 17 14:44:12.868442 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:44:12.868409 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-bzjv5"
Apr 17 14:44:12.878514 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:44:12.878481 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cmx47\" (UniqueName: \"kubernetes.io/projected/986c86fc-f849-46e0-9e9a-57e2d40d8e5b-kube-api-access-cmx47\") pod \"986c86fc-f849-46e0-9e9a-57e2d40d8e5b\" (UID: \"986c86fc-f849-46e0-9e9a-57e2d40d8e5b\") "
Apr 17 14:44:12.880745 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:44:12.880711 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/986c86fc-f849-46e0-9e9a-57e2d40d8e5b-kube-api-access-cmx47" (OuterVolumeSpecName: "kube-api-access-cmx47") pod "986c86fc-f849-46e0-9e9a-57e2d40d8e5b" (UID: "986c86fc-f849-46e0-9e9a-57e2d40d8e5b"). InnerVolumeSpecName "kube-api-access-cmx47". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 14:44:12.980229 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:44:12.980193 2582 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cmx47\" (UniqueName: \"kubernetes.io/projected/986c86fc-f849-46e0-9e9a-57e2d40d8e5b-kube-api-access-cmx47\") on node \"ip-10-0-143-171.ec2.internal\" DevicePath \"\""
Apr 17 14:44:13.002427 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:44:13.002392 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xk7f2"]
Apr 17 14:44:13.004269 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:44:13.004240 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84c2a07d_f1ec_4180_ac4b_baf6c4323db5.slice/crio-4eb3418d8e69b593874e9585f285560114d0c88405ddc6c96e8ea6e5e255bec0 WatchSource:0}: Error finding container 4eb3418d8e69b593874e9585f285560114d0c88405ddc6c96e8ea6e5e255bec0: Status 404 returned error can't find the container with id 4eb3418d8e69b593874e9585f285560114d0c88405ddc6c96e8ea6e5e255bec0
Apr 17 14:44:13.036609 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:44:13.036583 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-bzjv5"]
Apr 17 14:44:13.042031 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:44:13.042001 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3106d538_3643_42c5_92cb_dbea4af3a682.slice/crio-5c7641c4b55a3327e27564f640ec809fca44c791897d25f2c820ad1149555513 WatchSource:0}: Error finding container 5c7641c4b55a3327e27564f640ec809fca44c791897d25f2c820ad1149555513: Status 404 returned error can't find the container with id 5c7641c4b55a3327e27564f640ec809fca44c791897d25f2c820ad1149555513
Apr 17 14:44:13.159469 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:44:13.159437 2582 status_manager.go:895] "Failed to get status for pod" podUID="986c86fc-f849-46e0-9e9a-57e2d40d8e5b" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-flbwn" err="pods \"limitador-operator-controller-manager-85c4996f8c-flbwn\" is forbidden: User \"system:node:ip-10-0-143-171.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-143-171.ec2.internal' and this object"
Apr 17 14:44:13.744541 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:44:13.744500 2582 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="986c86fc-f849-46e0-9e9a-57e2d40d8e5b" path="/var/lib/kubelet/pods/986c86fc-f849-46e0-9e9a-57e2d40d8e5b/volumes"
Apr 17 14:44:13.855271 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:44:13.855229 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-bzjv5" event={"ID":"3106d538-3643-42c5-92cb-dbea4af3a682","Type":"ContainerStarted","Data":"5c7641c4b55a3327e27564f640ec809fca44c791897d25f2c820ad1149555513"}
Apr 17 14:44:13.857756 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:44:13.857720 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xk7f2" event={"ID":"84c2a07d-f1ec-4180-ac4b-baf6c4323db5","Type":"ContainerStarted","Data":"1fa385d2bd37aea400051a11807c03933fa18349177a7fb1e00eb98ce50e0f03"}
Apr 17 14:44:13.857756 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:44:13.857759 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xk7f2" event={"ID":"84c2a07d-f1ec-4180-ac4b-baf6c4323db5","Type":"ContainerStarted","Data":"4eb3418d8e69b593874e9585f285560114d0c88405ddc6c96e8ea6e5e255bec0"}
Apr 17 14:44:13.858003 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:44:13.857881 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xk7f2"
Apr 17 14:44:13.878431 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:44:13.878378 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xk7f2" podStartSLOduration=1.878362649 podStartE2EDuration="1.878362649s" podCreationTimestamp="2026-04-17 14:44:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 14:44:13.876142526 +0000 UTC m=+606.743528177" watchObservedRunningTime="2026-04-17 14:44:13.878362649 +0000 UTC m=+606.745748327"
Apr 17 14:44:17.877914 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:44:17.877871 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-bzjv5" event={"ID":"3106d538-3643-42c5-92cb-dbea4af3a682","Type":"ContainerStarted","Data":"a713e3fe7218367a84816caf26acf0a499199122c2c48182c3b56b4994b5bb84"}
Apr 17 14:44:17.878389 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:44:17.877991 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-bzjv5"
Apr 17 14:44:17.902402 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:44:17.902353 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-bzjv5" podStartSLOduration=2.115481877 podStartE2EDuration="5.902337668s" podCreationTimestamp="2026-04-17 14:44:12 +0000 UTC" firstStartedPulling="2026-04-17 14:44:13.044418582 +0000 UTC m=+605.911804215" lastFinishedPulling="2026-04-17 14:44:16.831274377 +0000 UTC m=+609.698660006" observedRunningTime="2026-04-17 14:44:17.89913438 +0000 UTC m=+610.766520032" watchObservedRunningTime="2026-04-17 14:44:17.902337668 +0000 UTC m=+610.769723319"
Apr 17 14:44:24.866414 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:44:24.866333 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xk7f2"
Apr 17 14:44:28.884623 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:44:28.884590 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-bzjv5"
Apr 17 14:44:50.362025 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:44:50.361986 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-7498df8756-qjv2j"]
Apr 17 14:44:50.366443 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:44:50.366415 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-qjv2j"
Apr 17 14:44:50.368866 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:44:50.368837 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-6slcb\""
Apr 17 14:44:50.371381 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:44:50.371356 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-qjv2j"]
Apr 17 14:44:50.517140 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:44:50.517099 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plxzn\" (UniqueName: \"kubernetes.io/projected/4b4047c5-76cb-4b96-80f7-96c2d5af8f1a-kube-api-access-plxzn\") pod \"authorino-7498df8756-qjv2j\" (UID: \"4b4047c5-76cb-4b96-80f7-96c2d5af8f1a\") " pod="kuadrant-system/authorino-7498df8756-qjv2j"
Apr 17 14:44:50.618279 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:44:50.618174 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-plxzn\" (UniqueName: \"kubernetes.io/projected/4b4047c5-76cb-4b96-80f7-96c2d5af8f1a-kube-api-access-plxzn\") pod \"authorino-7498df8756-qjv2j\" (UID: \"4b4047c5-76cb-4b96-80f7-96c2d5af8f1a\") " pod="kuadrant-system/authorino-7498df8756-qjv2j"
Apr 17 14:44:50.626196 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:44:50.626153 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-plxzn\" (UniqueName: \"kubernetes.io/projected/4b4047c5-76cb-4b96-80f7-96c2d5af8f1a-kube-api-access-plxzn\") pod \"authorino-7498df8756-qjv2j\" (UID: \"4b4047c5-76cb-4b96-80f7-96c2d5af8f1a\") " pod="kuadrant-system/authorino-7498df8756-qjv2j"
Apr 17 14:44:50.677185 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:44:50.677140 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-qjv2j"
Apr 17 14:44:50.804169 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:44:50.804143 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-qjv2j"]
Apr 17 14:44:50.806212 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:44:50.806169 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4b4047c5_76cb_4b96_80f7_96c2d5af8f1a.slice/crio-b77d6a67f864701e442c7c91d6308bfff582dad8c670cc992caddfde84a713cb WatchSource:0}: Error finding container b77d6a67f864701e442c7c91d6308bfff582dad8c670cc992caddfde84a713cb: Status 404 returned error can't find the container with id b77d6a67f864701e442c7c91d6308bfff582dad8c670cc992caddfde84a713cb
Apr 17 14:44:51.010321 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:44:51.010266 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-qjv2j" event={"ID":"4b4047c5-76cb-4b96-80f7-96c2d5af8f1a","Type":"ContainerStarted","Data":"b77d6a67f864701e442c7c91d6308bfff582dad8c670cc992caddfde84a713cb"}
Apr 17 14:44:54.027299 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:44:54.027248 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-qjv2j" event={"ID":"4b4047c5-76cb-4b96-80f7-96c2d5af8f1a","Type":"ContainerStarted","Data":"125c849597fa05b3b67bc9eb48990265cf88c501f0dc86b517a9447e9560b6fb"}
Apr 17 14:44:54.043785 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:44:54.043721 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-7498df8756-qjv2j" podStartSLOduration=1.66403517 podStartE2EDuration="4.043704957s" podCreationTimestamp="2026-04-17 14:44:50 +0000 UTC" firstStartedPulling="2026-04-17 14:44:50.807474172 +0000 UTC m=+643.674859801" lastFinishedPulling="2026-04-17 14:44:53.187143952 +0000 UTC m=+646.054529588" observedRunningTime="2026-04-17 14:44:54.043534858 +0000 UTC m=+646.910920508" watchObservedRunningTime="2026-04-17 14:44:54.043704957 +0000 UTC m=+646.911090680"
Apr 17 14:45:19.111924 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:45:19.111889 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-5f74fd9cd6-pdgxm"]
Apr 17 14:45:19.116565 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:45:19.116543 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-5f74fd9cd6-pdgxm"
Apr 17 14:45:19.121333 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:45:19.121305 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-5f74fd9cd6-pdgxm"]
Apr 17 14:45:19.154205 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:45:19.154162 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvlkl\" (UniqueName: \"kubernetes.io/projected/8ba5e708-37e4-4f8e-81db-fe0fbbef56a1-kube-api-access-xvlkl\") pod \"authorino-5f74fd9cd6-pdgxm\" (UID: \"8ba5e708-37e4-4f8e-81db-fe0fbbef56a1\") " pod="kuadrant-system/authorino-5f74fd9cd6-pdgxm"
Apr 17 14:45:19.255250 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:45:19.255209 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xvlkl\" (UniqueName: \"kubernetes.io/projected/8ba5e708-37e4-4f8e-81db-fe0fbbef56a1-kube-api-access-xvlkl\") pod \"authorino-5f74fd9cd6-pdgxm\" (UID: \"8ba5e708-37e4-4f8e-81db-fe0fbbef56a1\") " pod="kuadrant-system/authorino-5f74fd9cd6-pdgxm"
Apr 17 14:45:19.265224 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:45:19.265195 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvlkl\" (UniqueName: \"kubernetes.io/projected/8ba5e708-37e4-4f8e-81db-fe0fbbef56a1-kube-api-access-xvlkl\") pod \"authorino-5f74fd9cd6-pdgxm\" (UID: \"8ba5e708-37e4-4f8e-81db-fe0fbbef56a1\") " pod="kuadrant-system/authorino-5f74fd9cd6-pdgxm"
Apr 17 14:45:19.296066 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:45:19.296025 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-5f74fd9cd6-pdgxm"]
Apr 17 14:45:19.296390 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:45:19.296373 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-5f74fd9cd6-pdgxm"
Apr 17 14:45:19.438058 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:45:19.438010 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-5f74fd9cd6-pdgxm"]
Apr 17 14:45:19.442214 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:45:19.442173 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8ba5e708_37e4_4f8e_81db_fe0fbbef56a1.slice/crio-e5abaed3b053aa06b2d7f11d05e4b13be54390ce438036c3039f78678c22d398 WatchSource:0}: Error finding container e5abaed3b053aa06b2d7f11d05e4b13be54390ce438036c3039f78678c22d398: Status 404 returned error can't find the container with id e5abaed3b053aa06b2d7f11d05e4b13be54390ce438036c3039f78678c22d398
Apr 17 14:45:20.133460 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:45:20.133368 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-5f74fd9cd6-pdgxm" event={"ID":"8ba5e708-37e4-4f8e-81db-fe0fbbef56a1","Type":"ContainerStarted","Data":"cacacafbfd9ac5c6a79306b884d7228521e2bd6026eb59a5451f86a97e422a84"}
Apr 17 14:45:20.133460 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:45:20.133408 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-5f74fd9cd6-pdgxm" event={"ID":"8ba5e708-37e4-4f8e-81db-fe0fbbef56a1","Type":"ContainerStarted","Data":"e5abaed3b053aa06b2d7f11d05e4b13be54390ce438036c3039f78678c22d398"}
Apr 17 14:45:20.133460 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:45:20.133389 2582 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-5f74fd9cd6-pdgxm" podUID="8ba5e708-37e4-4f8e-81db-fe0fbbef56a1" containerName="authorino" containerID="cri-o://cacacafbfd9ac5c6a79306b884d7228521e2bd6026eb59a5451f86a97e422a84" gracePeriod=30
Apr 17 14:45:20.147654 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:45:20.147602 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-5f74fd9cd6-pdgxm" podStartSLOduration=0.807673854 podStartE2EDuration="1.147588142s" podCreationTimestamp="2026-04-17 14:45:19 +0000 UTC" firstStartedPulling="2026-04-17 14:45:19.443851433 +0000 UTC m=+672.311237062" lastFinishedPulling="2026-04-17 14:45:19.783765715 +0000 UTC m=+672.651151350" observedRunningTime="2026-04-17 14:45:20.146469791 +0000 UTC m=+673.013855455" watchObservedRunningTime="2026-04-17 14:45:20.147588142 +0000 UTC m=+673.014973793"
Apr 17 14:45:20.381950 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:45:20.381923 2582 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-5f74fd9cd6-pdgxm"
Apr 17 14:45:20.464079 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:45:20.463983 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xvlkl\" (UniqueName: \"kubernetes.io/projected/8ba5e708-37e4-4f8e-81db-fe0fbbef56a1-kube-api-access-xvlkl\") pod \"8ba5e708-37e4-4f8e-81db-fe0fbbef56a1\" (UID: \"8ba5e708-37e4-4f8e-81db-fe0fbbef56a1\") "
Apr 17 14:45:20.466220 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:45:20.466183 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ba5e708-37e4-4f8e-81db-fe0fbbef56a1-kube-api-access-xvlkl" (OuterVolumeSpecName: "kube-api-access-xvlkl") pod "8ba5e708-37e4-4f8e-81db-fe0fbbef56a1" (UID: "8ba5e708-37e4-4f8e-81db-fe0fbbef56a1"). InnerVolumeSpecName "kube-api-access-xvlkl". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 14:45:20.565090 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:45:20.565051 2582 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xvlkl\" (UniqueName: \"kubernetes.io/projected/8ba5e708-37e4-4f8e-81db-fe0fbbef56a1-kube-api-access-xvlkl\") on node \"ip-10-0-143-171.ec2.internal\" DevicePath \"\""
Apr 17 14:45:21.139285 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:45:21.139248 2582 generic.go:358] "Generic (PLEG): container finished" podID="8ba5e708-37e4-4f8e-81db-fe0fbbef56a1" containerID="cacacafbfd9ac5c6a79306b884d7228521e2bd6026eb59a5451f86a97e422a84" exitCode=0
Apr 17 14:45:21.139718 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:45:21.139297 2582 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-5f74fd9cd6-pdgxm"
Apr 17 14:45:21.139718 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:45:21.139333 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-5f74fd9cd6-pdgxm" event={"ID":"8ba5e708-37e4-4f8e-81db-fe0fbbef56a1","Type":"ContainerDied","Data":"cacacafbfd9ac5c6a79306b884d7228521e2bd6026eb59a5451f86a97e422a84"}
Apr 17 14:45:21.139718 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:45:21.139370 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-5f74fd9cd6-pdgxm" event={"ID":"8ba5e708-37e4-4f8e-81db-fe0fbbef56a1","Type":"ContainerDied","Data":"e5abaed3b053aa06b2d7f11d05e4b13be54390ce438036c3039f78678c22d398"}
Apr 17 14:45:21.139718 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:45:21.139387 2582 scope.go:117] "RemoveContainer" containerID="cacacafbfd9ac5c6a79306b884d7228521e2bd6026eb59a5451f86a97e422a84"
Apr 17 14:45:21.149040 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:45:21.149006 2582 scope.go:117] "RemoveContainer" containerID="cacacafbfd9ac5c6a79306b884d7228521e2bd6026eb59a5451f86a97e422a84"
Apr 17 14:45:21.149314 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:45:21.149294 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cacacafbfd9ac5c6a79306b884d7228521e2bd6026eb59a5451f86a97e422a84\": container with ID starting with cacacafbfd9ac5c6a79306b884d7228521e2bd6026eb59a5451f86a97e422a84 not found: ID does not exist" containerID="cacacafbfd9ac5c6a79306b884d7228521e2bd6026eb59a5451f86a97e422a84"
Apr 17 14:45:21.149389 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:45:21.149327 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cacacafbfd9ac5c6a79306b884d7228521e2bd6026eb59a5451f86a97e422a84"} err="failed to get container status \"cacacafbfd9ac5c6a79306b884d7228521e2bd6026eb59a5451f86a97e422a84\": rpc error: code = NotFound desc = could not find container \"cacacafbfd9ac5c6a79306b884d7228521e2bd6026eb59a5451f86a97e422a84\": container with ID starting with cacacafbfd9ac5c6a79306b884d7228521e2bd6026eb59a5451f86a97e422a84 not found: ID does not exist"
Apr 17 14:45:21.159753 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:45:21.159721 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-5f74fd9cd6-pdgxm"]
Apr 17 14:45:21.164977 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:45:21.164949 2582 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-5f74fd9cd6-pdgxm"]
Apr 17 14:45:21.743850 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:45:21.743795 2582 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ba5e708-37e4-4f8e-81db-fe0fbbef56a1" path="/var/lib/kubelet/pods/8ba5e708-37e4-4f8e-81db-fe0fbbef56a1/volumes"
Apr 17 14:45:22.486629 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:45:22.486594 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-qjv2j"]
Apr 17 14:45:22.487073 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:45:22.486794 2582 kuberuntime_container.go:864] "Killing 
container with a grace period" pod="kuadrant-system/authorino-7498df8756-qjv2j" podUID="4b4047c5-76cb-4b96-80f7-96c2d5af8f1a" containerName="authorino" containerID="cri-o://125c849597fa05b3b67bc9eb48990265cf88c501f0dc86b517a9447e9560b6fb" gracePeriod=30 Apr 17 14:45:22.738850 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:45:22.738758 2582 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-qjv2j" Apr 17 14:45:22.785201 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:45:22.785169 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-plxzn\" (UniqueName: \"kubernetes.io/projected/4b4047c5-76cb-4b96-80f7-96c2d5af8f1a-kube-api-access-plxzn\") pod \"4b4047c5-76cb-4b96-80f7-96c2d5af8f1a\" (UID: \"4b4047c5-76cb-4b96-80f7-96c2d5af8f1a\") " Apr 17 14:45:22.787437 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:45:22.787405 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b4047c5-76cb-4b96-80f7-96c2d5af8f1a-kube-api-access-plxzn" (OuterVolumeSpecName: "kube-api-access-plxzn") pod "4b4047c5-76cb-4b96-80f7-96c2d5af8f1a" (UID: "4b4047c5-76cb-4b96-80f7-96c2d5af8f1a"). InnerVolumeSpecName "kube-api-access-plxzn". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 14:45:22.886385 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:45:22.886349 2582 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-plxzn\" (UniqueName: \"kubernetes.io/projected/4b4047c5-76cb-4b96-80f7-96c2d5af8f1a-kube-api-access-plxzn\") on node \"ip-10-0-143-171.ec2.internal\" DevicePath \"\"" Apr 17 14:45:23.150567 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:45:23.150529 2582 generic.go:358] "Generic (PLEG): container finished" podID="4b4047c5-76cb-4b96-80f7-96c2d5af8f1a" containerID="125c849597fa05b3b67bc9eb48990265cf88c501f0dc86b517a9447e9560b6fb" exitCode=0 Apr 17 14:45:23.150716 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:45:23.150583 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-qjv2j" event={"ID":"4b4047c5-76cb-4b96-80f7-96c2d5af8f1a","Type":"ContainerDied","Data":"125c849597fa05b3b67bc9eb48990265cf88c501f0dc86b517a9447e9560b6fb"} Apr 17 14:45:23.150716 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:45:23.150609 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-qjv2j" event={"ID":"4b4047c5-76cb-4b96-80f7-96c2d5af8f1a","Type":"ContainerDied","Data":"b77d6a67f864701e442c7c91d6308bfff582dad8c670cc992caddfde84a713cb"} Apr 17 14:45:23.150716 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:45:23.150628 2582 scope.go:117] "RemoveContainer" containerID="125c849597fa05b3b67bc9eb48990265cf88c501f0dc86b517a9447e9560b6fb" Apr 17 14:45:23.150716 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:45:23.150586 2582 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7498df8756-qjv2j" Apr 17 14:45:23.159782 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:45:23.159763 2582 scope.go:117] "RemoveContainer" containerID="125c849597fa05b3b67bc9eb48990265cf88c501f0dc86b517a9447e9560b6fb" Apr 17 14:45:23.160083 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:45:23.160059 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"125c849597fa05b3b67bc9eb48990265cf88c501f0dc86b517a9447e9560b6fb\": container with ID starting with 125c849597fa05b3b67bc9eb48990265cf88c501f0dc86b517a9447e9560b6fb not found: ID does not exist" containerID="125c849597fa05b3b67bc9eb48990265cf88c501f0dc86b517a9447e9560b6fb" Apr 17 14:45:23.160181 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:45:23.160089 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"125c849597fa05b3b67bc9eb48990265cf88c501f0dc86b517a9447e9560b6fb"} err="failed to get container status \"125c849597fa05b3b67bc9eb48990265cf88c501f0dc86b517a9447e9560b6fb\": rpc error: code = NotFound desc = could not find container \"125c849597fa05b3b67bc9eb48990265cf88c501f0dc86b517a9447e9560b6fb\": container with ID starting with 125c849597fa05b3b67bc9eb48990265cf88c501f0dc86b517a9447e9560b6fb not found: ID does not exist" Apr 17 14:45:23.171029 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:45:23.170992 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-qjv2j"] Apr 17 14:45:23.176681 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:45:23.176648 2582 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-7498df8756-qjv2j"] Apr 17 14:45:23.743962 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:45:23.743932 2582 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b4047c5-76cb-4b96-80f7-96c2d5af8f1a" 
path="/var/lib/kubelet/pods/4b4047c5-76cb-4b96-80f7-96c2d5af8f1a/volumes" Apr 17 14:45:59.641110 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:45:59.641035 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w"] Apr 17 14:45:59.641543 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:45:59.641418 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4b4047c5-76cb-4b96-80f7-96c2d5af8f1a" containerName="authorino" Apr 17 14:45:59.641543 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:45:59.641429 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b4047c5-76cb-4b96-80f7-96c2d5af8f1a" containerName="authorino" Apr 17 14:45:59.641543 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:45:59.641439 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8ba5e708-37e4-4f8e-81db-fe0fbbef56a1" containerName="authorino" Apr 17 14:45:59.641543 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:45:59.641445 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ba5e708-37e4-4f8e-81db-fe0fbbef56a1" containerName="authorino" Apr 17 14:45:59.641543 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:45:59.641508 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="8ba5e708-37e4-4f8e-81db-fe0fbbef56a1" containerName="authorino" Apr 17 14:45:59.641543 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:45:59.641519 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="4b4047c5-76cb-4b96-80f7-96c2d5af8f1a" containerName="authorino" Apr 17 14:45:59.643480 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:45:59.643460 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w" Apr 17 14:45:59.646702 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:45:59.646675 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\"" Apr 17 14:45:59.646884 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:45:59.646677 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"facebook-opt-125m-simulated-kserve-self-signed-certs\"" Apr 17 14:45:59.646884 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:45:59.646682 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-zkz9m\"" Apr 17 14:45:59.646884 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:45:59.646750 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\"" Apr 17 14:45:59.653757 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:45:59.653733 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w"] Apr 17 14:45:59.731167 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:45:59.731128 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/42ab39a2-9f59-485a-85b3-c767544c515b-home\") pod \"facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w\" (UID: \"42ab39a2-9f59-485a-85b3-c767544c515b\") " pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w" Apr 17 14:45:59.731352 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:45:59.731210 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/42ab39a2-9f59-485a-85b3-c767544c515b-dshm\") pod \"facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w\" (UID: \"42ab39a2-9f59-485a-85b3-c767544c515b\") " pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w" 
Apr 17 14:45:59.731352 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:45:59.731233 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/42ab39a2-9f59-485a-85b3-c767544c515b-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w\" (UID: \"42ab39a2-9f59-485a-85b3-c767544c515b\") " pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w" Apr 17 14:45:59.731352 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:45:59.731258 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/42ab39a2-9f59-485a-85b3-c767544c515b-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w\" (UID: \"42ab39a2-9f59-485a-85b3-c767544c515b\") " pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w" Apr 17 14:45:59.731352 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:45:59.731310 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/42ab39a2-9f59-485a-85b3-c767544c515b-model-cache\") pod \"facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w\" (UID: \"42ab39a2-9f59-485a-85b3-c767544c515b\") " pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w" Apr 17 14:45:59.731488 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:45:59.731370 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msfc6\" (UniqueName: \"kubernetes.io/projected/42ab39a2-9f59-485a-85b3-c767544c515b-kube-api-access-msfc6\") pod \"facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w\" (UID: \"42ab39a2-9f59-485a-85b3-c767544c515b\") " pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w" Apr 17 14:45:59.832409 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:45:59.832367 2582 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/42ab39a2-9f59-485a-85b3-c767544c515b-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w\" (UID: \"42ab39a2-9f59-485a-85b3-c767544c515b\") " pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w" Apr 17 14:45:59.832409 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:45:59.832414 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/42ab39a2-9f59-485a-85b3-c767544c515b-model-cache\") pod \"facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w\" (UID: \"42ab39a2-9f59-485a-85b3-c767544c515b\") " pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w" Apr 17 14:45:59.832659 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:45:59.832448 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-msfc6\" (UniqueName: \"kubernetes.io/projected/42ab39a2-9f59-485a-85b3-c767544c515b-kube-api-access-msfc6\") pod \"facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w\" (UID: \"42ab39a2-9f59-485a-85b3-c767544c515b\") " pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w" Apr 17 14:45:59.832659 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:45:59.832498 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/42ab39a2-9f59-485a-85b3-c767544c515b-home\") pod \"facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w\" (UID: \"42ab39a2-9f59-485a-85b3-c767544c515b\") " pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w" Apr 17 14:45:59.832659 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:45:59.832596 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/42ab39a2-9f59-485a-85b3-c767544c515b-dshm\") pod 
\"facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w\" (UID: \"42ab39a2-9f59-485a-85b3-c767544c515b\") " pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w" Apr 17 14:45:59.832659 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:45:59.832624 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/42ab39a2-9f59-485a-85b3-c767544c515b-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w\" (UID: \"42ab39a2-9f59-485a-85b3-c767544c515b\") " pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w" Apr 17 14:45:59.832899 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:45:59.832857 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/42ab39a2-9f59-485a-85b3-c767544c515b-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w\" (UID: \"42ab39a2-9f59-485a-85b3-c767544c515b\") " pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w" Apr 17 14:45:59.833026 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:45:59.833003 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/42ab39a2-9f59-485a-85b3-c767544c515b-model-cache\") pod \"facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w\" (UID: \"42ab39a2-9f59-485a-85b3-c767544c515b\") " pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w" Apr 17 14:45:59.833086 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:45:59.833028 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/42ab39a2-9f59-485a-85b3-c767544c515b-home\") pod \"facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w\" (UID: \"42ab39a2-9f59-485a-85b3-c767544c515b\") " pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w" Apr 17 14:45:59.835301 ip-10-0-143-171 kubenswrapper[2582]: I0417 
14:45:59.835273 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/42ab39a2-9f59-485a-85b3-c767544c515b-dshm\") pod \"facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w\" (UID: \"42ab39a2-9f59-485a-85b3-c767544c515b\") " pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w" Apr 17 14:45:59.835691 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:45:59.835670 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/42ab39a2-9f59-485a-85b3-c767544c515b-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w\" (UID: \"42ab39a2-9f59-485a-85b3-c767544c515b\") " pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w" Apr 17 14:45:59.839948 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:45:59.839923 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-msfc6\" (UniqueName: \"kubernetes.io/projected/42ab39a2-9f59-485a-85b3-c767544c515b-kube-api-access-msfc6\") pod \"facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w\" (UID: \"42ab39a2-9f59-485a-85b3-c767544c515b\") " pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w" Apr 17 14:45:59.954833 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:45:59.954719 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w" Apr 17 14:46:00.087084 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:00.087048 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w"] Apr 17 14:46:00.089434 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:46:00.089404 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42ab39a2_9f59_485a_85b3_c767544c515b.slice/crio-afe80a7721a11aeb384032619f3957e6369f4d51f0e702a549523f913c4c181e WatchSource:0}: Error finding container afe80a7721a11aeb384032619f3957e6369f4d51f0e702a549523f913c4c181e: Status 404 returned error can't find the container with id afe80a7721a11aeb384032619f3957e6369f4d51f0e702a549523f913c4c181e Apr 17 14:46:00.091321 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:00.091299 2582 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 14:46:00.309174 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:00.309134 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w" event={"ID":"42ab39a2-9f59-485a-85b3-c767544c515b","Type":"ContainerStarted","Data":"afe80a7721a11aeb384032619f3957e6369f4d51f0e702a549523f913c4c181e"} Apr 17 14:46:05.038651 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:05.038556 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8"] Apr 17 14:46:05.056002 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:05.055955 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8"] Apr 17 14:46:05.056172 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:05.056136 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8" Apr 17 14:46:05.058727 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:05.058700 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-unab60ef4d3a239b5143b412cab04acac3-kserve-self-signed-certs\"" Apr 17 14:46:05.082883 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:05.082839 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f75b132f-d730-40ea-9dc0-96a3c4ccc878-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8\" (UID: \"f75b132f-d730-40ea-9dc0-96a3c4ccc878\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8" Apr 17 14:46:05.083069 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:05.082906 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f75b132f-d730-40ea-9dc0-96a3c4ccc878-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8\" (UID: \"f75b132f-d730-40ea-9dc0-96a3c4ccc878\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8" Apr 17 14:46:05.083069 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:05.082941 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2tt7\" (UniqueName: \"kubernetes.io/projected/f75b132f-d730-40ea-9dc0-96a3c4ccc878-kube-api-access-m2tt7\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8\" (UID: \"f75b132f-d730-40ea-9dc0-96a3c4ccc878\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8" Apr 17 14:46:05.083069 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:05.082988 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f75b132f-d730-40ea-9dc0-96a3c4ccc878-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8\" (UID: \"f75b132f-d730-40ea-9dc0-96a3c4ccc878\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8" Apr 17 14:46:05.083069 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:05.083022 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f75b132f-d730-40ea-9dc0-96a3c4ccc878-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8\" (UID: \"f75b132f-d730-40ea-9dc0-96a3c4ccc878\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8" Apr 17 14:46:05.083069 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:05.083051 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f75b132f-d730-40ea-9dc0-96a3c4ccc878-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8\" (UID: \"f75b132f-d730-40ea-9dc0-96a3c4ccc878\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8" Apr 17 14:46:05.184074 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:05.183956 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f75b132f-d730-40ea-9dc0-96a3c4ccc878-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8\" (UID: \"f75b132f-d730-40ea-9dc0-96a3c4ccc878\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8" Apr 17 14:46:05.184074 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:05.184067 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m2tt7\" (UniqueName: 
\"kubernetes.io/projected/f75b132f-d730-40ea-9dc0-96a3c4ccc878-kube-api-access-m2tt7\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8\" (UID: \"f75b132f-d730-40ea-9dc0-96a3c4ccc878\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8" Apr 17 14:46:05.184331 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:05.184134 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f75b132f-d730-40ea-9dc0-96a3c4ccc878-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8\" (UID: \"f75b132f-d730-40ea-9dc0-96a3c4ccc878\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8" Apr 17 14:46:05.184331 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:05.184177 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f75b132f-d730-40ea-9dc0-96a3c4ccc878-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8\" (UID: \"f75b132f-d730-40ea-9dc0-96a3c4ccc878\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8" Apr 17 14:46:05.184331 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:05.184223 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f75b132f-d730-40ea-9dc0-96a3c4ccc878-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8\" (UID: \"f75b132f-d730-40ea-9dc0-96a3c4ccc878\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8" Apr 17 14:46:05.184331 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:05.184304 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f75b132f-d730-40ea-9dc0-96a3c4ccc878-kserve-provision-location\") pod 
\"e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8\" (UID: \"f75b132f-d730-40ea-9dc0-96a3c4ccc878\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8" Apr 17 14:46:05.184611 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:05.184562 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f75b132f-d730-40ea-9dc0-96a3c4ccc878-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8\" (UID: \"f75b132f-d730-40ea-9dc0-96a3c4ccc878\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8" Apr 17 14:46:05.184741 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:05.184647 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f75b132f-d730-40ea-9dc0-96a3c4ccc878-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8\" (UID: \"f75b132f-d730-40ea-9dc0-96a3c4ccc878\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8" Apr 17 14:46:05.184741 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:05.184701 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f75b132f-d730-40ea-9dc0-96a3c4ccc878-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8\" (UID: \"f75b132f-d730-40ea-9dc0-96a3c4ccc878\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8" Apr 17 14:46:05.187142 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:05.187120 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f75b132f-d730-40ea-9dc0-96a3c4ccc878-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8\" (UID: \"f75b132f-d730-40ea-9dc0-96a3c4ccc878\") " 
pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8"
Apr 17 14:46:05.187650 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:05.187624 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f75b132f-d730-40ea-9dc0-96a3c4ccc878-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8\" (UID: \"f75b132f-d730-40ea-9dc0-96a3c4ccc878\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8"
Apr 17 14:46:05.191977 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:05.191941 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2tt7\" (UniqueName: \"kubernetes.io/projected/f75b132f-d730-40ea-9dc0-96a3c4ccc878-kube-api-access-m2tt7\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8\" (UID: \"f75b132f-d730-40ea-9dc0-96a3c4ccc878\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8"
Apr 17 14:46:05.373859 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:05.373754 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8"
Apr 17 14:46:06.858099 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:06.858069 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8"]
Apr 17 14:46:06.859929 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:46:06.859893 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf75b132f_d730_40ea_9dc0_96a3c4ccc878.slice/crio-0a853cfef3ae8a772d596bf041cdc43bd89fcf48eec58f60731519a3d552f9ff WatchSource:0}: Error finding container 0a853cfef3ae8a772d596bf041cdc43bd89fcf48eec58f60731519a3d552f9ff: Status 404 returned error can't find the container with id 0a853cfef3ae8a772d596bf041cdc43bd89fcf48eec58f60731519a3d552f9ff
Apr 17 14:46:07.340863 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:07.340821 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8" event={"ID":"f75b132f-d730-40ea-9dc0-96a3c4ccc878","Type":"ContainerStarted","Data":"9a841ad46d12401d17e24b2f16050b981a5c4f9ee57cca8472bcd8588b599669"}
Apr 17 14:46:07.341028 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:07.340870 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8" event={"ID":"f75b132f-d730-40ea-9dc0-96a3c4ccc878","Type":"ContainerStarted","Data":"0a853cfef3ae8a772d596bf041cdc43bd89fcf48eec58f60731519a3d552f9ff"}
Apr 17 14:46:07.342452 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:07.342422 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w" event={"ID":"42ab39a2-9f59-485a-85b3-c767544c515b","Type":"ContainerStarted","Data":"ef50dedd04b10104653f5c952eedee86e365df50c3b7755fd4d10bf1d183579e"}
Apr 17 14:46:13.369062 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:13.369022 2582 generic.go:358] "Generic (PLEG): container finished" podID="f75b132f-d730-40ea-9dc0-96a3c4ccc878" containerID="9a841ad46d12401d17e24b2f16050b981a5c4f9ee57cca8472bcd8588b599669" exitCode=0
Apr 17 14:46:13.369536 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:13.369096 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8" event={"ID":"f75b132f-d730-40ea-9dc0-96a3c4ccc878","Type":"ContainerDied","Data":"9a841ad46d12401d17e24b2f16050b981a5c4f9ee57cca8472bcd8588b599669"}
Apr 17 14:46:13.370820 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:13.370767 2582 generic.go:358] "Generic (PLEG): container finished" podID="42ab39a2-9f59-485a-85b3-c767544c515b" containerID="ef50dedd04b10104653f5c952eedee86e365df50c3b7755fd4d10bf1d183579e" exitCode=0
Apr 17 14:46:13.370922 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:13.370826 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w" event={"ID":"42ab39a2-9f59-485a-85b3-c767544c515b","Type":"ContainerDied","Data":"ef50dedd04b10104653f5c952eedee86e365df50c3b7755fd4d10bf1d183579e"}
Apr 17 14:46:15.383753 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:15.383715 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8_f75b132f-d730-40ea-9dc0-96a3c4ccc878/main/0.log"
Apr 17 14:46:15.384197 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:15.384098 2582 generic.go:358] "Generic (PLEG): container finished" podID="f75b132f-d730-40ea-9dc0-96a3c4ccc878" containerID="9035875fd86edb0087e896501cb1950174f438391232b70b3b509d8b409172db" exitCode=2
Apr 17 14:46:15.384197 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:15.384172 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8" event={"ID":"f75b132f-d730-40ea-9dc0-96a3c4ccc878","Type":"ContainerDied","Data":"9035875fd86edb0087e896501cb1950174f438391232b70b3b509d8b409172db"}
Apr 17 14:46:15.384607 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:15.384585 2582 scope.go:117] "RemoveContainer" containerID="9035875fd86edb0087e896501cb1950174f438391232b70b3b509d8b409172db"
Apr 17 14:46:15.385919 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:15.385903 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w_42ab39a2-9f59-485a-85b3-c767544c515b/main/0.log"
Apr 17 14:46:15.386221 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:15.386193 2582 generic.go:358] "Generic (PLEG): container finished" podID="42ab39a2-9f59-485a-85b3-c767544c515b" containerID="d015bf0c01cd021e41a1b00623580064f6819ae73de52d243ec1edc3c6070ec7" exitCode=2
Apr 17 14:46:15.386294 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:15.386257 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w" event={"ID":"42ab39a2-9f59-485a-85b3-c767544c515b","Type":"ContainerDied","Data":"d015bf0c01cd021e41a1b00623580064f6819ae73de52d243ec1edc3c6070ec7"}
Apr 17 14:46:15.386611 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:15.386582 2582 scope.go:117] "RemoveContainer" containerID="d015bf0c01cd021e41a1b00623580064f6819ae73de52d243ec1edc3c6070ec7"
Apr 17 14:46:16.391819 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:16.391774 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8_f75b132f-d730-40ea-9dc0-96a3c4ccc878/main/1.log"
Apr 17 14:46:16.392241 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:16.392225 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8_f75b132f-d730-40ea-9dc0-96a3c4ccc878/main/0.log"
Apr 17 14:46:16.392548 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:16.392524 2582 generic.go:358] "Generic (PLEG): container finished" podID="f75b132f-d730-40ea-9dc0-96a3c4ccc878" containerID="a3e210b9fcfd5920b327ee9e24f1db148dea039b533e14fd982c31ddc9bb0149" exitCode=2
Apr 17 14:46:16.392614 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:16.392594 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8" event={"ID":"f75b132f-d730-40ea-9dc0-96a3c4ccc878","Type":"ContainerDied","Data":"a3e210b9fcfd5920b327ee9e24f1db148dea039b533e14fd982c31ddc9bb0149"}
Apr 17 14:46:16.392659 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:16.392633 2582 scope.go:117] "RemoveContainer" containerID="9035875fd86edb0087e896501cb1950174f438391232b70b3b509d8b409172db"
Apr 17 14:46:16.393114 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:16.393092 2582 scope.go:117] "RemoveContainer" containerID="a3e210b9fcfd5920b327ee9e24f1db148dea039b533e14fd982c31ddc9bb0149"
Apr 17 14:46:16.393306 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:46:16.393285 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 10s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8_llm(f75b132f-d730-40ea-9dc0-96a3c4ccc878)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8" podUID="f75b132f-d730-40ea-9dc0-96a3c4ccc878"
Apr 17 14:46:16.402045 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:16.402021 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w_42ab39a2-9f59-485a-85b3-c767544c515b/main/1.log"
Apr 17 14:46:16.402557 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:16.402537 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w_42ab39a2-9f59-485a-85b3-c767544c515b/main/0.log"
Apr 17 14:46:16.402982 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:16.402957 2582 generic.go:358] "Generic (PLEG): container finished" podID="42ab39a2-9f59-485a-85b3-c767544c515b" containerID="1e7e7e54020a872093a33fc717887aeb04543b9011ed94cd8332beeb4498e613" exitCode=2
Apr 17 14:46:16.403080 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:16.403011 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w" event={"ID":"42ab39a2-9f59-485a-85b3-c767544c515b","Type":"ContainerDied","Data":"1e7e7e54020a872093a33fc717887aeb04543b9011ed94cd8332beeb4498e613"}
Apr 17 14:46:16.403496 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:16.403477 2582 scope.go:117] "RemoveContainer" containerID="1e7e7e54020a872093a33fc717887aeb04543b9011ed94cd8332beeb4498e613"
Apr 17 14:46:16.403725 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:46:16.403704 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 10s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w_llm(42ab39a2-9f59-485a-85b3-c767544c515b)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w" podUID="42ab39a2-9f59-485a-85b3-c767544c515b"
Apr 17 14:46:16.407041 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:16.407025 2582 scope.go:117] "RemoveContainer" containerID="d015bf0c01cd021e41a1b00623580064f6819ae73de52d243ec1edc3c6070ec7"
Apr 17 14:46:17.409555 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:17.409517 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w_42ab39a2-9f59-485a-85b3-c767544c515b/main/1.log"
Apr 17 14:46:17.411664 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:17.411643 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8_f75b132f-d730-40ea-9dc0-96a3c4ccc878/main/1.log"
Apr 17 14:46:19.955336 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:19.955287 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w"
Apr 17 14:46:19.955336 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:19.955328 2582 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w"
Apr 17 14:46:19.955748 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:19.955735 2582 scope.go:117] "RemoveContainer" containerID="1e7e7e54020a872093a33fc717887aeb04543b9011ed94cd8332beeb4498e613"
Apr 17 14:46:19.955961 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:46:19.955942 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 10s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w_llm(42ab39a2-9f59-485a-85b3-c767544c515b)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w" podUID="42ab39a2-9f59-485a-85b3-c767544c515b"
Apr 17 14:46:24.936726 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:24.936683 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5"]
Apr 17 14:46:24.942628 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:24.942606 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5"
Apr 17 14:46:24.945230 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:24.945205 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-2-simulated-kserve-self-signed-certs\""
Apr 17 14:46:24.949592 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:24.949565 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5"]
Apr 17 14:46:25.069072 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:25.069027 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fe291b55-da84-48e3-9dfe-e7491ab46bdf-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5\" (UID: \"fe291b55-da84-48e3-9dfe-e7491ab46bdf\") " pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5"
Apr 17 14:46:25.069251 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:25.069093 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/fe291b55-da84-48e3-9dfe-e7491ab46bdf-model-cache\") pod \"e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5\" (UID: \"fe291b55-da84-48e3-9dfe-e7491ab46bdf\") " pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5"
Apr 17 14:46:25.069251 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:25.069120 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fe291b55-da84-48e3-9dfe-e7491ab46bdf-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5\" (UID: \"fe291b55-da84-48e3-9dfe-e7491ab46bdf\") " pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5"
Apr 17 14:46:25.069251 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:25.069181 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/fe291b55-da84-48e3-9dfe-e7491ab46bdf-home\") pod \"e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5\" (UID: \"fe291b55-da84-48e3-9dfe-e7491ab46bdf\") " pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5"
Apr 17 14:46:25.069251 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:25.069206 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/fe291b55-da84-48e3-9dfe-e7491ab46bdf-dshm\") pod \"e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5\" (UID: \"fe291b55-da84-48e3-9dfe-e7491ab46bdf\") " pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5"
Apr 17 14:46:25.069448 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:25.069255 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5xts\" (UniqueName: \"kubernetes.io/projected/fe291b55-da84-48e3-9dfe-e7491ab46bdf-kube-api-access-d5xts\") pod \"e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5\" (UID: \"fe291b55-da84-48e3-9dfe-e7491ab46bdf\") " pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5"
Apr 17 14:46:25.170422 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:25.170379 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/fe291b55-da84-48e3-9dfe-e7491ab46bdf-home\") pod \"e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5\" (UID: \"fe291b55-da84-48e3-9dfe-e7491ab46bdf\") " pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5"
Apr 17 14:46:25.170422 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:25.170418 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/fe291b55-da84-48e3-9dfe-e7491ab46bdf-dshm\") pod \"e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5\" (UID: \"fe291b55-da84-48e3-9dfe-e7491ab46bdf\") " pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5"
Apr 17 14:46:25.170660 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:25.170446 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d5xts\" (UniqueName: \"kubernetes.io/projected/fe291b55-da84-48e3-9dfe-e7491ab46bdf-kube-api-access-d5xts\") pod \"e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5\" (UID: \"fe291b55-da84-48e3-9dfe-e7491ab46bdf\") " pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5"
Apr 17 14:46:25.170660 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:25.170523 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fe291b55-da84-48e3-9dfe-e7491ab46bdf-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5\" (UID: \"fe291b55-da84-48e3-9dfe-e7491ab46bdf\") " pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5"
Apr 17 14:46:25.170660 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:25.170558 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/fe291b55-da84-48e3-9dfe-e7491ab46bdf-model-cache\") pod \"e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5\" (UID: \"fe291b55-da84-48e3-9dfe-e7491ab46bdf\") " pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5"
Apr 17 14:46:25.170923 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:25.170892 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/fe291b55-da84-48e3-9dfe-e7491ab46bdf-home\") pod \"e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5\" (UID: \"fe291b55-da84-48e3-9dfe-e7491ab46bdf\") " pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5"
Apr 17 14:46:25.171037 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:25.170943 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fe291b55-da84-48e3-9dfe-e7491ab46bdf-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5\" (UID: \"fe291b55-da84-48e3-9dfe-e7491ab46bdf\") " pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5"
Apr 17 14:46:25.171037 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:25.170956 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/fe291b55-da84-48e3-9dfe-e7491ab46bdf-model-cache\") pod \"e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5\" (UID: \"fe291b55-da84-48e3-9dfe-e7491ab46bdf\") " pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5"
Apr 17 14:46:25.171260 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:25.171238 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fe291b55-da84-48e3-9dfe-e7491ab46bdf-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5\" (UID: \"fe291b55-da84-48e3-9dfe-e7491ab46bdf\") " pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5"
Apr 17 14:46:25.173050 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:25.173029 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/fe291b55-da84-48e3-9dfe-e7491ab46bdf-dshm\") pod \"e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5\" (UID: \"fe291b55-da84-48e3-9dfe-e7491ab46bdf\") " pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5"
Apr 17 14:46:25.173292 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:25.173271 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fe291b55-da84-48e3-9dfe-e7491ab46bdf-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5\" (UID: \"fe291b55-da84-48e3-9dfe-e7491ab46bdf\") " pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5"
Apr 17 14:46:25.177984 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:25.177962 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5xts\" (UniqueName: \"kubernetes.io/projected/fe291b55-da84-48e3-9dfe-e7491ab46bdf-kube-api-access-d5xts\") pod \"e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5\" (UID: \"fe291b55-da84-48e3-9dfe-e7491ab46bdf\") " pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5"
Apr 17 14:46:25.254591 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:25.254555 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5"
Apr 17 14:46:25.374273 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:25.374236 2582 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8"
Apr 17 14:46:25.374435 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:25.374283 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8"
Apr 17 14:46:25.374829 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:25.374793 2582 scope.go:117] "RemoveContainer" containerID="a3e210b9fcfd5920b327ee9e24f1db148dea039b533e14fd982c31ddc9bb0149"
Apr 17 14:46:25.375118 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:46:25.375095 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 10s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8_llm(f75b132f-d730-40ea-9dc0-96a3c4ccc878)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8" podUID="f75b132f-d730-40ea-9dc0-96a3c4ccc878"
Apr 17 14:46:25.386874 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:25.386837 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5"]
Apr 17 14:46:25.388697 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:46:25.388671 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe291b55_da84_48e3_9dfe_e7491ab46bdf.slice/crio-7e42454e68686a9b1abcec1d0ecc06a0b3808b2fe79c16af3718794b46790db8 WatchSource:0}: Error finding container 7e42454e68686a9b1abcec1d0ecc06a0b3808b2fe79c16af3718794b46790db8: Status 404 returned error can't find the container with id 7e42454e68686a9b1abcec1d0ecc06a0b3808b2fe79c16af3718794b46790db8
Apr 17 14:46:25.452045 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:25.452008 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5" event={"ID":"fe291b55-da84-48e3-9dfe-e7491ab46bdf","Type":"ContainerStarted","Data":"7e42454e68686a9b1abcec1d0ecc06a0b3808b2fe79c16af3718794b46790db8"}
Apr 17 14:46:26.457347 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:26.457309 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5" event={"ID":"fe291b55-da84-48e3-9dfe-e7491ab46bdf","Type":"ContainerStarted","Data":"29688beaaa5eb10e73723c2644168b3263ae39b1e0e18746909e28bbd02461e8"}
Apr 17 14:46:31.484814 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:31.484764 2582 generic.go:358] "Generic (PLEG): container finished" podID="fe291b55-da84-48e3-9dfe-e7491ab46bdf" containerID="29688beaaa5eb10e73723c2644168b3263ae39b1e0e18746909e28bbd02461e8" exitCode=0
Apr 17 14:46:31.485174 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:31.484828 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5" event={"ID":"fe291b55-da84-48e3-9dfe-e7491ab46bdf","Type":"ContainerDied","Data":"29688beaaa5eb10e73723c2644168b3263ae39b1e0e18746909e28bbd02461e8"}
Apr 17 14:46:32.490334 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:32.490304 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5_fe291b55-da84-48e3-9dfe-e7491ab46bdf/main/0.log"
Apr 17 14:46:32.490695 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:32.490655 2582 generic.go:358] "Generic (PLEG): container finished" podID="fe291b55-da84-48e3-9dfe-e7491ab46bdf" containerID="b9c31f15b3c7a7efa2c7e371a60d2c9d5837167c2b1093137c779eaed2899515" exitCode=2
Apr 17 14:46:32.490766 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:32.490701 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5" event={"ID":"fe291b55-da84-48e3-9dfe-e7491ab46bdf","Type":"ContainerDied","Data":"b9c31f15b3c7a7efa2c7e371a60d2c9d5837167c2b1093137c779eaed2899515"}
Apr 17 14:46:32.491121 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:32.491104 2582 scope.go:117] "RemoveContainer" containerID="b9c31f15b3c7a7efa2c7e371a60d2c9d5837167c2b1093137c779eaed2899515"
Apr 17 14:46:33.496111 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:33.496075 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5_fe291b55-da84-48e3-9dfe-e7491ab46bdf/main/1.log"
Apr 17 14:46:33.496491 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:33.496420 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5_fe291b55-da84-48e3-9dfe-e7491ab46bdf/main/0.log"
Apr 17 14:46:33.496770 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:33.496746 2582 generic.go:358] "Generic (PLEG): container finished" podID="fe291b55-da84-48e3-9dfe-e7491ab46bdf" containerID="7f498d151f9eb18d7d0b1e8af35864fc42bd58d0e962f27db0a08e407d26b30a" exitCode=2
Apr 17 14:46:33.496863 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:33.496835 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5" event={"ID":"fe291b55-da84-48e3-9dfe-e7491ab46bdf","Type":"ContainerDied","Data":"7f498d151f9eb18d7d0b1e8af35864fc42bd58d0e962f27db0a08e407d26b30a"}
Apr 17 14:46:33.496908 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:33.496878 2582 scope.go:117] "RemoveContainer" containerID="b9c31f15b3c7a7efa2c7e371a60d2c9d5837167c2b1093137c779eaed2899515"
Apr 17 14:46:33.497426 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:33.497396 2582 scope.go:117] "RemoveContainer" containerID="7f498d151f9eb18d7d0b1e8af35864fc42bd58d0e962f27db0a08e407d26b30a"
Apr 17 14:46:33.497670 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:46:33.497648 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 10s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5_llm(fe291b55-da84-48e3-9dfe-e7491ab46bdf)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5" podUID="fe291b55-da84-48e3-9dfe-e7491ab46bdf"
Apr 17 14:46:34.242701 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:34.242665 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-simulated-kserve-8485d77cdf-cjhgj"]
Apr 17 14:46:34.247560 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:34.247541 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-cjhgj"
Apr 17 14:46:34.249820 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:34.249778 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-simulated-kserve-self-signed-certs\""
Apr 17 14:46:34.254349 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:34.254326 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-8485d77cdf-cjhgj"]
Apr 17 14:46:34.354978 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:34.354928 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f9a2b25f-f431-45a4-970d-44f3af0f7ec5-dshm\") pod \"e2e-distinct-simulated-kserve-8485d77cdf-cjhgj\" (UID: \"f9a2b25f-f431-45a4-970d-44f3af0f7ec5\") " pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-cjhgj"
Apr 17 14:46:34.354978 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:34.354978 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqvsb\" (UniqueName: \"kubernetes.io/projected/f9a2b25f-f431-45a4-970d-44f3af0f7ec5-kube-api-access-qqvsb\") pod \"e2e-distinct-simulated-kserve-8485d77cdf-cjhgj\" (UID: \"f9a2b25f-f431-45a4-970d-44f3af0f7ec5\") " pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-cjhgj"
Apr 17 14:46:34.355206 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:34.355018 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f9a2b25f-f431-45a4-970d-44f3af0f7ec5-home\") pod \"e2e-distinct-simulated-kserve-8485d77cdf-cjhgj\" (UID: \"f9a2b25f-f431-45a4-970d-44f3af0f7ec5\") " pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-cjhgj"
Apr 17 14:46:34.355206 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:34.355043 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f9a2b25f-f431-45a4-970d-44f3af0f7ec5-model-cache\") pod \"e2e-distinct-simulated-kserve-8485d77cdf-cjhgj\" (UID: \"f9a2b25f-f431-45a4-970d-44f3af0f7ec5\") " pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-cjhgj"
Apr 17 14:46:34.355206 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:34.355074 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f9a2b25f-f431-45a4-970d-44f3af0f7ec5-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-8485d77cdf-cjhgj\" (UID: \"f9a2b25f-f431-45a4-970d-44f3af0f7ec5\") " pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-cjhgj"
Apr 17 14:46:34.355206 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:34.355113 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f9a2b25f-f431-45a4-970d-44f3af0f7ec5-tls-certs\") pod \"e2e-distinct-simulated-kserve-8485d77cdf-cjhgj\" (UID: \"f9a2b25f-f431-45a4-970d-44f3af0f7ec5\") " pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-cjhgj"
Apr 17 14:46:34.456306 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:34.456267 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f9a2b25f-f431-45a4-970d-44f3af0f7ec5-home\") pod \"e2e-distinct-simulated-kserve-8485d77cdf-cjhgj\" (UID: \"f9a2b25f-f431-45a4-970d-44f3af0f7ec5\") " pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-cjhgj"
Apr 17 14:46:34.456306 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:34.456304 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f9a2b25f-f431-45a4-970d-44f3af0f7ec5-model-cache\") pod \"e2e-distinct-simulated-kserve-8485d77cdf-cjhgj\" (UID: \"f9a2b25f-f431-45a4-970d-44f3af0f7ec5\") " pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-cjhgj"
Apr 17 14:46:34.456532 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:34.456324 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f9a2b25f-f431-45a4-970d-44f3af0f7ec5-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-8485d77cdf-cjhgj\" (UID: \"f9a2b25f-f431-45a4-970d-44f3af0f7ec5\") " pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-cjhgj"
Apr 17 14:46:34.456532 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:34.456350 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f9a2b25f-f431-45a4-970d-44f3af0f7ec5-tls-certs\") pod \"e2e-distinct-simulated-kserve-8485d77cdf-cjhgj\" (UID: \"f9a2b25f-f431-45a4-970d-44f3af0f7ec5\") " pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-cjhgj"
Apr 17 14:46:34.456532 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:34.456458 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f9a2b25f-f431-45a4-970d-44f3af0f7ec5-dshm\") pod \"e2e-distinct-simulated-kserve-8485d77cdf-cjhgj\" (UID: \"f9a2b25f-f431-45a4-970d-44f3af0f7ec5\") " pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-cjhgj"
Apr 17 14:46:34.456532 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:34.456485 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qqvsb\" (UniqueName: \"kubernetes.io/projected/f9a2b25f-f431-45a4-970d-44f3af0f7ec5-kube-api-access-qqvsb\") pod \"e2e-distinct-simulated-kserve-8485d77cdf-cjhgj\" (UID: \"f9a2b25f-f431-45a4-970d-44f3af0f7ec5\") " pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-cjhgj"
Apr 17 14:46:34.456758 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:34.456737 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f9a2b25f-f431-45a4-970d-44f3af0f7ec5-model-cache\") pod \"e2e-distinct-simulated-kserve-8485d77cdf-cjhgj\" (UID: \"f9a2b25f-f431-45a4-970d-44f3af0f7ec5\") " pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-cjhgj"
Apr 17 14:46:34.456843 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:34.456821 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f9a2b25f-f431-45a4-970d-44f3af0f7ec5-home\") pod \"e2e-distinct-simulated-kserve-8485d77cdf-cjhgj\" (UID: \"f9a2b25f-f431-45a4-970d-44f3af0f7ec5\") " pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-cjhgj"
Apr 17 14:46:34.456898 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:34.456866 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f9a2b25f-f431-45a4-970d-44f3af0f7ec5-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-8485d77cdf-cjhgj\" (UID: \"f9a2b25f-f431-45a4-970d-44f3af0f7ec5\") " pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-cjhgj"
Apr 17 14:46:34.458941 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:34.458919 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f9a2b25f-f431-45a4-970d-44f3af0f7ec5-dshm\") pod \"e2e-distinct-simulated-kserve-8485d77cdf-cjhgj\" (UID: \"f9a2b25f-f431-45a4-970d-44f3af0f7ec5\") " pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-cjhgj"
Apr 17 14:46:34.459314 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:34.459293 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f9a2b25f-f431-45a4-970d-44f3af0f7ec5-tls-certs\") pod \"e2e-distinct-simulated-kserve-8485d77cdf-cjhgj\" (UID: \"f9a2b25f-f431-45a4-970d-44f3af0f7ec5\") " pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-cjhgj"
Apr 17 14:46:34.464023 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:34.464001 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqvsb\" (UniqueName: \"kubernetes.io/projected/f9a2b25f-f431-45a4-970d-44f3af0f7ec5-kube-api-access-qqvsb\") pod \"e2e-distinct-simulated-kserve-8485d77cdf-cjhgj\" (UID: \"f9a2b25f-f431-45a4-970d-44f3af0f7ec5\") " pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-cjhgj"
Apr 17 14:46:34.502339 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:34.502306 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5_fe291b55-da84-48e3-9dfe-e7491ab46bdf/main/1.log"
Apr 17 14:46:34.560299 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:34.560261 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-cjhgj"
Apr 17 14:46:34.702084 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:34.702052 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-8485d77cdf-cjhgj"]
Apr 17 14:46:34.703428 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:46:34.703404 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9a2b25f_f431_45a4_970d_44f3af0f7ec5.slice/crio-bac36d02e7d1d2165be0490206a3aed7f9b0893fd73dcba7b2136f643f958ab2 WatchSource:0}: Error finding container bac36d02e7d1d2165be0490206a3aed7f9b0893fd73dcba7b2136f643f958ab2: Status 404 returned error can't find the container with id bac36d02e7d1d2165be0490206a3aed7f9b0893fd73dcba7b2136f643f958ab2
Apr 17 14:46:34.738920 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:34.738889 2582 scope.go:117] "RemoveContainer" containerID="1e7e7e54020a872093a33fc717887aeb04543b9011ed94cd8332beeb4498e613"
Apr 17 14:46:35.255501 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:35.255454 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5"
Apr 17 14:46:35.255501 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:35.255501 2582 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5"
Apr 17 14:46:35.256083 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:35.256063 2582 scope.go:117] "RemoveContainer" containerID="7f498d151f9eb18d7d0b1e8af35864fc42bd58d0e962f27db0a08e407d26b30a"
Apr 17 14:46:35.256339 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:46:35.256317 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 10s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5_llm(fe291b55-da84-48e3-9dfe-e7491ab46bdf)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5" podUID="fe291b55-da84-48e3-9dfe-e7491ab46bdf"
Apr 17 14:46:35.508940 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:35.508885 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-cjhgj" event={"ID":"f9a2b25f-f431-45a4-970d-44f3af0f7ec5","Type":"ContainerStarted","Data":"e4a57f3b8f4b5bdb11adc3fc7687317d3efa146cbb45e111736ec070d5b7d4b4"}
Apr 17 14:46:35.508940 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:35.508932 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-cjhgj" event={"ID":"f9a2b25f-f431-45a4-970d-44f3af0f7ec5","Type":"ContainerStarted","Data":"bac36d02e7d1d2165be0490206a3aed7f9b0893fd73dcba7b2136f643f958ab2"}
Apr 17 14:46:35.510586 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:35.510566 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w_42ab39a2-9f59-485a-85b3-c767544c515b/main/2.log"
Apr 17 14:46:35.511016 ip-10-0-143-171
kubenswrapper[2582]: I0417 14:46:35.510995 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w_42ab39a2-9f59-485a-85b3-c767544c515b/main/1.log" Apr 17 14:46:35.511375 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:35.511348 2582 generic.go:358] "Generic (PLEG): container finished" podID="42ab39a2-9f59-485a-85b3-c767544c515b" containerID="3daf73cf03b07a261c1ec349d6aa875e46da5b88c66cd99e0d5c4fb8864d330e" exitCode=2 Apr 17 14:46:35.511422 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:35.511392 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w" event={"ID":"42ab39a2-9f59-485a-85b3-c767544c515b","Type":"ContainerDied","Data":"3daf73cf03b07a261c1ec349d6aa875e46da5b88c66cd99e0d5c4fb8864d330e"} Apr 17 14:46:35.511460 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:35.511431 2582 scope.go:117] "RemoveContainer" containerID="1e7e7e54020a872093a33fc717887aeb04543b9011ed94cd8332beeb4498e613" Apr 17 14:46:35.511920 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:35.511900 2582 scope.go:117] "RemoveContainer" containerID="3daf73cf03b07a261c1ec349d6aa875e46da5b88c66cd99e0d5c4fb8864d330e" Apr 17 14:46:35.512153 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:46:35.512125 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 20s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w_llm(42ab39a2-9f59-485a-85b3-c767544c515b)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w" podUID="42ab39a2-9f59-485a-85b3-c767544c515b" Apr 17 14:46:36.516895 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:36.516844 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w_42ab39a2-9f59-485a-85b3-c767544c515b/main/2.log" Apr 17 14:46:39.738610 ip-10-0-143-171 
kubenswrapper[2582]: I0417 14:46:39.738570 2582 scope.go:117] "RemoveContainer" containerID="a3e210b9fcfd5920b327ee9e24f1db148dea039b533e14fd982c31ddc9bb0149" Apr 17 14:46:39.955331 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:39.955287 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w" Apr 17 14:46:39.955331 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:39.955333 2582 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w" Apr 17 14:46:39.955834 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:39.955786 2582 scope.go:117] "RemoveContainer" containerID="3daf73cf03b07a261c1ec349d6aa875e46da5b88c66cd99e0d5c4fb8864d330e" Apr 17 14:46:39.956052 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:46:39.956035 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 20s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w_llm(42ab39a2-9f59-485a-85b3-c767544c515b)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w" podUID="42ab39a2-9f59-485a-85b3-c767544c515b" Apr 17 14:46:40.535673 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:40.535637 2582 generic.go:358] "Generic (PLEG): container finished" podID="f9a2b25f-f431-45a4-970d-44f3af0f7ec5" containerID="e4a57f3b8f4b5bdb11adc3fc7687317d3efa146cbb45e111736ec070d5b7d4b4" exitCode=0 Apr 17 14:46:40.535895 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:40.535713 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-cjhgj" event={"ID":"f9a2b25f-f431-45a4-970d-44f3af0f7ec5","Type":"ContainerDied","Data":"e4a57f3b8f4b5bdb11adc3fc7687317d3efa146cbb45e111736ec070d5b7d4b4"} Apr 17 14:46:40.537374 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:40.537354 2582 log.go:25] "Finished parsing 
log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8_f75b132f-d730-40ea-9dc0-96a3c4ccc878/main/2.log" Apr 17 14:46:40.537854 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:40.537722 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8_f75b132f-d730-40ea-9dc0-96a3c4ccc878/main/1.log" Apr 17 14:46:40.538115 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:40.538094 2582 generic.go:358] "Generic (PLEG): container finished" podID="f75b132f-d730-40ea-9dc0-96a3c4ccc878" containerID="7fd0478d4e028984aa7913a63779f939bfd1a32cb5a47131be61ea4b9ba844d9" exitCode=2 Apr 17 14:46:40.538191 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:40.538127 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8" event={"ID":"f75b132f-d730-40ea-9dc0-96a3c4ccc878","Type":"ContainerDied","Data":"7fd0478d4e028984aa7913a63779f939bfd1a32cb5a47131be61ea4b9ba844d9"} Apr 17 14:46:40.538191 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:40.538158 2582 scope.go:117] "RemoveContainer" containerID="a3e210b9fcfd5920b327ee9e24f1db148dea039b533e14fd982c31ddc9bb0149" Apr 17 14:46:40.538686 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:40.538667 2582 scope.go:117] "RemoveContainer" containerID="7fd0478d4e028984aa7913a63779f939bfd1a32cb5a47131be61ea4b9ba844d9" Apr 17 14:46:40.538942 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:46:40.538912 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 20s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8_llm(f75b132f-d730-40ea-9dc0-96a3c4ccc878)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8" podUID="f75b132f-d730-40ea-9dc0-96a3c4ccc878" Apr 17 14:46:41.543916 ip-10-0-143-171 kubenswrapper[2582]: I0417 
14:46:41.543889 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-cjhgj_f9a2b25f-f431-45a4-970d-44f3af0f7ec5/main/0.log" Apr 17 14:46:41.544364 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:41.544230 2582 generic.go:358] "Generic (PLEG): container finished" podID="f9a2b25f-f431-45a4-970d-44f3af0f7ec5" containerID="a3649d2a5a3d3c901ec1f416d7c44f053966ea8beab26512c102cc77e2cc905a" exitCode=2 Apr 17 14:46:41.544364 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:41.544303 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-cjhgj" event={"ID":"f9a2b25f-f431-45a4-970d-44f3af0f7ec5","Type":"ContainerDied","Data":"a3649d2a5a3d3c901ec1f416d7c44f053966ea8beab26512c102cc77e2cc905a"} Apr 17 14:46:41.544707 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:41.544689 2582 scope.go:117] "RemoveContainer" containerID="a3649d2a5a3d3c901ec1f416d7c44f053966ea8beab26512c102cc77e2cc905a" Apr 17 14:46:41.545725 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:41.545707 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8_f75b132f-d730-40ea-9dc0-96a3c4ccc878/main/2.log" Apr 17 14:46:42.552623 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:42.552593 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-cjhgj_f9a2b25f-f431-45a4-970d-44f3af0f7ec5/main/1.log" Apr 17 14:46:42.553119 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:42.553049 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-cjhgj_f9a2b25f-f431-45a4-970d-44f3af0f7ec5/main/0.log" Apr 17 14:46:42.553405 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:42.553383 2582 generic.go:358] "Generic (PLEG): container finished" podID="f9a2b25f-f431-45a4-970d-44f3af0f7ec5" 
containerID="02eef9ca022a59a0dbb156d74171164dfe63433b6160146934323d9a9bd99355" exitCode=2 Apr 17 14:46:42.553468 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:42.553448 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-cjhgj" event={"ID":"f9a2b25f-f431-45a4-970d-44f3af0f7ec5","Type":"ContainerDied","Data":"02eef9ca022a59a0dbb156d74171164dfe63433b6160146934323d9a9bd99355"} Apr 17 14:46:42.553524 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:42.553492 2582 scope.go:117] "RemoveContainer" containerID="a3649d2a5a3d3c901ec1f416d7c44f053966ea8beab26512c102cc77e2cc905a" Apr 17 14:46:42.554004 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:42.553981 2582 scope.go:117] "RemoveContainer" containerID="02eef9ca022a59a0dbb156d74171164dfe63433b6160146934323d9a9bd99355" Apr 17 14:46:42.554262 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:46:42.554242 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 10s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-cjhgj_llm(f9a2b25f-f431-45a4-970d-44f3af0f7ec5)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-cjhgj" podUID="f9a2b25f-f431-45a4-970d-44f3af0f7ec5" Apr 17 14:46:43.559004 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:43.558972 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-cjhgj_f9a2b25f-f431-45a4-970d-44f3af0f7ec5/main/1.log" Apr 17 14:46:44.560434 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:44.560392 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-cjhgj" Apr 17 14:46:44.560434 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:44.560438 2582 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-cjhgj" Apr 17 14:46:44.561080 
ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:44.561060 2582 scope.go:117] "RemoveContainer" containerID="02eef9ca022a59a0dbb156d74171164dfe63433b6160146934323d9a9bd99355" Apr 17 14:46:44.561354 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:46:44.561324 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 10s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-cjhgj_llm(f9a2b25f-f431-45a4-970d-44f3af0f7ec5)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-cjhgj" podUID="f9a2b25f-f431-45a4-970d-44f3af0f7ec5" Apr 17 14:46:45.374326 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:45.374289 2582 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8" Apr 17 14:46:45.374326 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:45.374332 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8" Apr 17 14:46:45.374885 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:45.374865 2582 scope.go:117] "RemoveContainer" containerID="7fd0478d4e028984aa7913a63779f939bfd1a32cb5a47131be61ea4b9ba844d9" Apr 17 14:46:45.375104 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:46:45.375086 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 20s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8_llm(f75b132f-d730-40ea-9dc0-96a3c4ccc878)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8" podUID="f75b132f-d730-40ea-9dc0-96a3c4ccc878" Apr 17 14:46:48.740014 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:48.739974 2582 scope.go:117] "RemoveContainer" containerID="7f498d151f9eb18d7d0b1e8af35864fc42bd58d0e962f27db0a08e407d26b30a" Apr 17 
14:46:49.587215 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:49.587183 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5_fe291b55-da84-48e3-9dfe-e7491ab46bdf/main/2.log" Apr 17 14:46:49.587577 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:49.587559 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5_fe291b55-da84-48e3-9dfe-e7491ab46bdf/main/1.log" Apr 17 14:46:49.587905 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:49.587884 2582 generic.go:358] "Generic (PLEG): container finished" podID="fe291b55-da84-48e3-9dfe-e7491ab46bdf" containerID="8f9f12ba0c603a909193dc89839f01599e1edc4fd1b5176fb9e08359363dbf70" exitCode=2 Apr 17 14:46:49.587992 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:49.587968 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5" event={"ID":"fe291b55-da84-48e3-9dfe-e7491ab46bdf","Type":"ContainerDied","Data":"8f9f12ba0c603a909193dc89839f01599e1edc4fd1b5176fb9e08359363dbf70"} Apr 17 14:46:49.588051 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:49.588019 2582 scope.go:117] "RemoveContainer" containerID="7f498d151f9eb18d7d0b1e8af35864fc42bd58d0e962f27db0a08e407d26b30a" Apr 17 14:46:49.588412 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:49.588395 2582 scope.go:117] "RemoveContainer" containerID="8f9f12ba0c603a909193dc89839f01599e1edc4fd1b5176fb9e08359363dbf70" Apr 17 14:46:49.588644 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:46:49.588618 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 20s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5_llm(fe291b55-da84-48e3-9dfe-e7491ab46bdf)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5" podUID="fe291b55-da84-48e3-9dfe-e7491ab46bdf" Apr 17 14:46:50.594388 
ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:50.594362 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5_fe291b55-da84-48e3-9dfe-e7491ab46bdf/main/2.log" Apr 17 14:46:51.739459 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:51.739427 2582 scope.go:117] "RemoveContainer" containerID="3daf73cf03b07a261c1ec349d6aa875e46da5b88c66cd99e0d5c4fb8864d330e" Apr 17 14:46:51.739876 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:46:51.739621 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 20s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w_llm(42ab39a2-9f59-485a-85b3-c767544c515b)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w" podUID="42ab39a2-9f59-485a-85b3-c767544c515b" Apr 17 14:46:55.254910 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:55.254870 2582 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5" Apr 17 14:46:55.254910 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:55.254916 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5" Apr 17 14:46:55.255372 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:55.255347 2582 scope.go:117] "RemoveContainer" containerID="8f9f12ba0c603a909193dc89839f01599e1edc4fd1b5176fb9e08359363dbf70" Apr 17 14:46:55.255548 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:46:55.255530 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 20s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5_llm(fe291b55-da84-48e3-9dfe-e7491ab46bdf)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5" podUID="fe291b55-da84-48e3-9dfe-e7491ab46bdf" Apr 17 
14:46:56.437481 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:56.437448 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-555d546bff-69mx7"] Apr 17 14:46:56.442231 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:56.442202 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-69mx7" Apr 17 14:46:56.444654 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:56.444628 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"premium-simulated-simulated-premium-kserve-self-signed-certs\"" Apr 17 14:46:56.452414 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:56.452385 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-555d546bff-69mx7"] Apr 17 14:46:56.538230 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:56.538168 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d4e084cc-4e34-4e43-a5cc-8c48e232529a-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-555d546bff-69mx7\" (UID: \"d4e084cc-4e34-4e43-a5cc-8c48e232529a\") " pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-69mx7" Apr 17 14:46:56.538428 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:56.538255 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4l4wj\" (UniqueName: \"kubernetes.io/projected/d4e084cc-4e34-4e43-a5cc-8c48e232529a-kube-api-access-4l4wj\") pod \"premium-simulated-simulated-premium-kserve-555d546bff-69mx7\" (UID: \"d4e084cc-4e34-4e43-a5cc-8c48e232529a\") " pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-69mx7" Apr 17 14:46:56.538428 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:56.538301 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d4e084cc-4e34-4e43-a5cc-8c48e232529a-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-555d546bff-69mx7\" (UID: \"d4e084cc-4e34-4e43-a5cc-8c48e232529a\") " pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-69mx7" Apr 17 14:46:56.538428 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:56.538342 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d4e084cc-4e34-4e43-a5cc-8c48e232529a-home\") pod \"premium-simulated-simulated-premium-kserve-555d546bff-69mx7\" (UID: \"d4e084cc-4e34-4e43-a5cc-8c48e232529a\") " pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-69mx7" Apr 17 14:46:56.538428 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:56.538367 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d4e084cc-4e34-4e43-a5cc-8c48e232529a-dshm\") pod \"premium-simulated-simulated-premium-kserve-555d546bff-69mx7\" (UID: \"d4e084cc-4e34-4e43-a5cc-8c48e232529a\") " pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-69mx7" Apr 17 14:46:56.538578 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:56.538456 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d4e084cc-4e34-4e43-a5cc-8c48e232529a-model-cache\") pod \"premium-simulated-simulated-premium-kserve-555d546bff-69mx7\" (UID: \"d4e084cc-4e34-4e43-a5cc-8c48e232529a\") " pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-69mx7" Apr 17 14:46:56.639727 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:56.639681 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: 
\"kubernetes.io/empty-dir/d4e084cc-4e34-4e43-a5cc-8c48e232529a-dshm\") pod \"premium-simulated-simulated-premium-kserve-555d546bff-69mx7\" (UID: \"d4e084cc-4e34-4e43-a5cc-8c48e232529a\") " pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-69mx7" Apr 17 14:46:56.639954 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:56.639779 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d4e084cc-4e34-4e43-a5cc-8c48e232529a-model-cache\") pod \"premium-simulated-simulated-premium-kserve-555d546bff-69mx7\" (UID: \"d4e084cc-4e34-4e43-a5cc-8c48e232529a\") " pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-69mx7" Apr 17 14:46:56.639954 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:56.639847 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d4e084cc-4e34-4e43-a5cc-8c48e232529a-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-555d546bff-69mx7\" (UID: \"d4e084cc-4e34-4e43-a5cc-8c48e232529a\") " pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-69mx7" Apr 17 14:46:56.639954 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:56.639881 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4l4wj\" (UniqueName: \"kubernetes.io/projected/d4e084cc-4e34-4e43-a5cc-8c48e232529a-kube-api-access-4l4wj\") pod \"premium-simulated-simulated-premium-kserve-555d546bff-69mx7\" (UID: \"d4e084cc-4e34-4e43-a5cc-8c48e232529a\") " pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-69mx7" Apr 17 14:46:56.639954 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:56.639937 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d4e084cc-4e34-4e43-a5cc-8c48e232529a-kserve-provision-location\") pod 
\"premium-simulated-simulated-premium-kserve-555d546bff-69mx7\" (UID: \"d4e084cc-4e34-4e43-a5cc-8c48e232529a\") " pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-69mx7" Apr 17 14:46:56.640185 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:56.639974 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d4e084cc-4e34-4e43-a5cc-8c48e232529a-home\") pod \"premium-simulated-simulated-premium-kserve-555d546bff-69mx7\" (UID: \"d4e084cc-4e34-4e43-a5cc-8c48e232529a\") " pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-69mx7" Apr 17 14:46:56.640248 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:56.640210 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d4e084cc-4e34-4e43-a5cc-8c48e232529a-model-cache\") pod \"premium-simulated-simulated-premium-kserve-555d546bff-69mx7\" (UID: \"d4e084cc-4e34-4e43-a5cc-8c48e232529a\") " pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-69mx7" Apr 17 14:46:56.640317 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:56.640296 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d4e084cc-4e34-4e43-a5cc-8c48e232529a-home\") pod \"premium-simulated-simulated-premium-kserve-555d546bff-69mx7\" (UID: \"d4e084cc-4e34-4e43-a5cc-8c48e232529a\") " pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-69mx7" Apr 17 14:46:56.640384 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:56.640361 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d4e084cc-4e34-4e43-a5cc-8c48e232529a-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-555d546bff-69mx7\" (UID: \"d4e084cc-4e34-4e43-a5cc-8c48e232529a\") " pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-69mx7" Apr 17 
14:46:56.642145 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:56.642116 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d4e084cc-4e34-4e43-a5cc-8c48e232529a-dshm\") pod \"premium-simulated-simulated-premium-kserve-555d546bff-69mx7\" (UID: \"d4e084cc-4e34-4e43-a5cc-8c48e232529a\") " pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-69mx7" Apr 17 14:46:56.642463 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:56.642444 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d4e084cc-4e34-4e43-a5cc-8c48e232529a-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-555d546bff-69mx7\" (UID: \"d4e084cc-4e34-4e43-a5cc-8c48e232529a\") " pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-69mx7" Apr 17 14:46:56.647122 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:56.647097 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4l4wj\" (UniqueName: \"kubernetes.io/projected/d4e084cc-4e34-4e43-a5cc-8c48e232529a-kube-api-access-4l4wj\") pod \"premium-simulated-simulated-premium-kserve-555d546bff-69mx7\" (UID: \"d4e084cc-4e34-4e43-a5cc-8c48e232529a\") " pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-69mx7" Apr 17 14:46:56.739409 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:56.739322 2582 scope.go:117] "RemoveContainer" containerID="7fd0478d4e028984aa7913a63779f939bfd1a32cb5a47131be61ea4b9ba844d9" Apr 17 14:46:56.739557 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:46:56.739530 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 20s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8_llm(f75b132f-d730-40ea-9dc0-96a3c4ccc878)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8" 
podUID="f75b132f-d730-40ea-9dc0-96a3c4ccc878" Apr 17 14:46:56.752473 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:56.752425 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-69mx7" Apr 17 14:46:56.887273 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:56.887239 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-555d546bff-69mx7"] Apr 17 14:46:56.888623 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:46:56.888586 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd4e084cc_4e34_4e43_a5cc_8c48e232529a.slice/crio-eae94bb3d75c2973603740932ac2425d9a7780c2121be0d318bfb31a110402d2 WatchSource:0}: Error finding container eae94bb3d75c2973603740932ac2425d9a7780c2121be0d318bfb31a110402d2: Status 404 returned error can't find the container with id eae94bb3d75c2973603740932ac2425d9a7780c2121be0d318bfb31a110402d2 Apr 17 14:46:57.630425 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:57.630375 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-69mx7" event={"ID":"d4e084cc-4e34-4e43-a5cc-8c48e232529a","Type":"ContainerStarted","Data":"8f3e6ff33bc6571f2140de2aa339081cdff5d8705950bf23ee496a05d2bfae50"} Apr 17 14:46:57.630425 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:57.630420 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-69mx7" event={"ID":"d4e084cc-4e34-4e43-a5cc-8c48e232529a","Type":"ContainerStarted","Data":"eae94bb3d75c2973603740932ac2425d9a7780c2121be0d318bfb31a110402d2"} Apr 17 14:46:58.738836 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:58.738789 2582 scope.go:117] "RemoveContainer" containerID="02eef9ca022a59a0dbb156d74171164dfe63433b6160146934323d9a9bd99355" Apr 17 14:46:59.641538 ip-10-0-143-171 
kubenswrapper[2582]: I0417 14:46:59.641509 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-cjhgj_f9a2b25f-f431-45a4-970d-44f3af0f7ec5/main/2.log" Apr 17 14:46:59.641850 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:59.641835 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-cjhgj_f9a2b25f-f431-45a4-970d-44f3af0f7ec5/main/1.log" Apr 17 14:46:59.642203 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:59.642176 2582 generic.go:358] "Generic (PLEG): container finished" podID="f9a2b25f-f431-45a4-970d-44f3af0f7ec5" containerID="be8e22aee68095c4ac113e5b0f2e5d185e4142149f443a6f7bb057df1d2e6e29" exitCode=2 Apr 17 14:46:59.642283 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:59.642247 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-cjhgj" event={"ID":"f9a2b25f-f431-45a4-970d-44f3af0f7ec5","Type":"ContainerDied","Data":"be8e22aee68095c4ac113e5b0f2e5d185e4142149f443a6f7bb057df1d2e6e29"} Apr 17 14:46:59.642332 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:59.642292 2582 scope.go:117] "RemoveContainer" containerID="02eef9ca022a59a0dbb156d74171164dfe63433b6160146934323d9a9bd99355" Apr 17 14:46:59.642699 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:46:59.642683 2582 scope.go:117] "RemoveContainer" containerID="be8e22aee68095c4ac113e5b0f2e5d185e4142149f443a6f7bb057df1d2e6e29" Apr 17 14:46:59.642946 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:46:59.642918 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 20s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-cjhgj_llm(f9a2b25f-f431-45a4-970d-44f3af0f7ec5)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-cjhgj" podUID="f9a2b25f-f431-45a4-970d-44f3af0f7ec5" Apr 17 14:47:00.648492 ip-10-0-143-171 kubenswrapper[2582]: I0417 
14:47:00.648465 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-cjhgj_f9a2b25f-f431-45a4-970d-44f3af0f7ec5/main/2.log" Apr 17 14:47:04.560390 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:04.560351 2582 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-cjhgj" Apr 17 14:47:04.560390 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:04.560391 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-cjhgj" Apr 17 14:47:04.560934 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:04.560915 2582 scope.go:117] "RemoveContainer" containerID="be8e22aee68095c4ac113e5b0f2e5d185e4142149f443a6f7bb057df1d2e6e29" Apr 17 14:47:04.561169 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:47:04.561150 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 20s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-cjhgj_llm(f9a2b25f-f431-45a4-970d-44f3af0f7ec5)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-cjhgj" podUID="f9a2b25f-f431-45a4-970d-44f3af0f7ec5" Apr 17 14:47:04.739266 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:04.739228 2582 scope.go:117] "RemoveContainer" containerID="3daf73cf03b07a261c1ec349d6aa875e46da5b88c66cd99e0d5c4fb8864d330e" Apr 17 14:47:05.673062 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:05.673017 2582 generic.go:358] "Generic (PLEG): container finished" podID="d4e084cc-4e34-4e43-a5cc-8c48e232529a" containerID="8f3e6ff33bc6571f2140de2aa339081cdff5d8705950bf23ee496a05d2bfae50" exitCode=0 Apr 17 14:47:05.673522 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:05.673090 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-69mx7" 
event={"ID":"d4e084cc-4e34-4e43-a5cc-8c48e232529a","Type":"ContainerDied","Data":"8f3e6ff33bc6571f2140de2aa339081cdff5d8705950bf23ee496a05d2bfae50"} Apr 17 14:47:05.675168 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:05.675142 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w_42ab39a2-9f59-485a-85b3-c767544c515b/main/3.log" Apr 17 14:47:05.675612 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:05.675591 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w_42ab39a2-9f59-485a-85b3-c767544c515b/main/2.log" Apr 17 14:47:05.676006 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:05.675976 2582 generic.go:358] "Generic (PLEG): container finished" podID="42ab39a2-9f59-485a-85b3-c767544c515b" containerID="eaa2b90f5067f49270e20f629bc55aba3f20c5832a9e6df5cf88be8ee7301f8e" exitCode=2 Apr 17 14:47:05.676121 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:05.676049 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w" event={"ID":"42ab39a2-9f59-485a-85b3-c767544c515b","Type":"ContainerDied","Data":"eaa2b90f5067f49270e20f629bc55aba3f20c5832a9e6df5cf88be8ee7301f8e"} Apr 17 14:47:05.676121 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:05.676088 2582 scope.go:117] "RemoveContainer" containerID="3daf73cf03b07a261c1ec349d6aa875e46da5b88c66cd99e0d5c4fb8864d330e" Apr 17 14:47:05.676610 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:05.676590 2582 scope.go:117] "RemoveContainer" containerID="eaa2b90f5067f49270e20f629bc55aba3f20c5832a9e6df5cf88be8ee7301f8e" Apr 17 14:47:05.676862 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:47:05.676839 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main 
pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w_llm(42ab39a2-9f59-485a-85b3-c767544c515b)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w" podUID="42ab39a2-9f59-485a-85b3-c767544c515b" Apr 17 14:47:06.681869 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:06.681835 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-69mx7_d4e084cc-4e34-4e43-a5cc-8c48e232529a/main/0.log" Apr 17 14:47:06.682290 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:06.682166 2582 generic.go:358] "Generic (PLEG): container finished" podID="d4e084cc-4e34-4e43-a5cc-8c48e232529a" containerID="34c8ac1f17513494ca2f37802c39c01ddb6e0e36c611e7d1871b5755bfd542ee" exitCode=2 Apr 17 14:47:06.682290 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:06.682252 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-69mx7" event={"ID":"d4e084cc-4e34-4e43-a5cc-8c48e232529a","Type":"ContainerDied","Data":"34c8ac1f17513494ca2f37802c39c01ddb6e0e36c611e7d1871b5755bfd542ee"} Apr 17 14:47:06.682695 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:06.682668 2582 scope.go:117] "RemoveContainer" containerID="34c8ac1f17513494ca2f37802c39c01ddb6e0e36c611e7d1871b5755bfd542ee" Apr 17 14:47:06.683908 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:06.683892 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w_42ab39a2-9f59-485a-85b3-c767544c515b/main/3.log" Apr 17 14:47:06.752819 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:06.752776 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-69mx7" Apr 17 14:47:06.752819 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:06.752825 2582 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-69mx7" Apr 17 14:47:07.691057 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:07.691025 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-69mx7_d4e084cc-4e34-4e43-a5cc-8c48e232529a/main/1.log" Apr 17 14:47:07.691530 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:07.691448 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-69mx7_d4e084cc-4e34-4e43-a5cc-8c48e232529a/main/0.log" Apr 17 14:47:07.691826 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:07.691784 2582 generic.go:358] "Generic (PLEG): container finished" podID="d4e084cc-4e34-4e43-a5cc-8c48e232529a" containerID="32bb632d035ee6d1183ea245a9646c06a1c61abe5344aa08e812dad75bc9ccfe" exitCode=2 Apr 17 14:47:07.691895 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:07.691876 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-69mx7" event={"ID":"d4e084cc-4e34-4e43-a5cc-8c48e232529a","Type":"ContainerDied","Data":"32bb632d035ee6d1183ea245a9646c06a1c61abe5344aa08e812dad75bc9ccfe"} Apr 17 14:47:07.691935 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:07.691919 2582 scope.go:117] "RemoveContainer" containerID="34c8ac1f17513494ca2f37802c39c01ddb6e0e36c611e7d1871b5755bfd542ee" Apr 17 14:47:07.692243 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:07.692227 2582 scope.go:117] "RemoveContainer" containerID="32bb632d035ee6d1183ea245a9646c06a1c61abe5344aa08e812dad75bc9ccfe" Apr 17 14:47:07.692465 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:47:07.692448 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 10s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-69mx7_llm(d4e084cc-4e34-4e43-a5cc-8c48e232529a)\"" 
pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-69mx7" podUID="d4e084cc-4e34-4e43-a5cc-8c48e232529a" Apr 17 14:47:08.697784 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:08.697757 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-69mx7_d4e084cc-4e34-4e43-a5cc-8c48e232529a/main/1.log" Apr 17 14:47:08.698577 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:08.698557 2582 scope.go:117] "RemoveContainer" containerID="32bb632d035ee6d1183ea245a9646c06a1c61abe5344aa08e812dad75bc9ccfe" Apr 17 14:47:08.698851 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:47:08.698785 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 10s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-69mx7_llm(d4e084cc-4e34-4e43-a5cc-8c48e232529a)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-69mx7" podUID="d4e084cc-4e34-4e43-a5cc-8c48e232529a" Apr 17 14:47:08.738409 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:08.738370 2582 scope.go:117] "RemoveContainer" containerID="7fd0478d4e028984aa7913a63779f939bfd1a32cb5a47131be61ea4b9ba844d9" Apr 17 14:47:08.738611 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:08.738462 2582 scope.go:117] "RemoveContainer" containerID="8f9f12ba0c603a909193dc89839f01599e1edc4fd1b5176fb9e08359363dbf70" Apr 17 14:47:08.738691 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:47:08.738668 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 20s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5_llm(fe291b55-da84-48e3-9dfe-e7491ab46bdf)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5" podUID="fe291b55-da84-48e3-9dfe-e7491ab46bdf" Apr 17 14:47:09.704445 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:09.704414 2582 
log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8_f75b132f-d730-40ea-9dc0-96a3c4ccc878/main/3.log" Apr 17 14:47:09.704883 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:09.704871 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8_f75b132f-d730-40ea-9dc0-96a3c4ccc878/main/2.log" Apr 17 14:47:09.705240 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:09.705215 2582 generic.go:358] "Generic (PLEG): container finished" podID="f75b132f-d730-40ea-9dc0-96a3c4ccc878" containerID="4aaecdf6bb9b743b6f0752e1d8495b1b132721653cd02a885481706ffd83405d" exitCode=2 Apr 17 14:47:09.705363 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:09.705293 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8" event={"ID":"f75b132f-d730-40ea-9dc0-96a3c4ccc878","Type":"ContainerDied","Data":"4aaecdf6bb9b743b6f0752e1d8495b1b132721653cd02a885481706ffd83405d"} Apr 17 14:47:09.705363 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:09.705346 2582 scope.go:117] "RemoveContainer" containerID="7fd0478d4e028984aa7913a63779f939bfd1a32cb5a47131be61ea4b9ba844d9" Apr 17 14:47:09.705869 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:09.705852 2582 scope.go:117] "RemoveContainer" containerID="4aaecdf6bb9b743b6f0752e1d8495b1b132721653cd02a885481706ffd83405d" Apr 17 14:47:09.706108 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:47:09.706086 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8_llm(f75b132f-d730-40ea-9dc0-96a3c4ccc878)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8" podUID="f75b132f-d730-40ea-9dc0-96a3c4ccc878" Apr 17 14:47:09.955080 ip-10-0-143-171 
kubenswrapper[2582]: I0417 14:47:09.954970 2582 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w" Apr 17 14:47:09.955080 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:09.955025 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w" Apr 17 14:47:09.955467 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:09.955454 2582 scope.go:117] "RemoveContainer" containerID="eaa2b90f5067f49270e20f629bc55aba3f20c5832a9e6df5cf88be8ee7301f8e" Apr 17 14:47:09.955669 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:47:09.955654 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w_llm(42ab39a2-9f59-485a-85b3-c767544c515b)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w" podUID="42ab39a2-9f59-485a-85b3-c767544c515b" Apr 17 14:47:10.710790 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:10.710755 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8_f75b132f-d730-40ea-9dc0-96a3c4ccc878/main/3.log" Apr 17 14:47:15.374784 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:15.374739 2582 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8" Apr 17 14:47:15.375204 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:15.374829 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8" Apr 17 14:47:15.375276 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:15.375261 2582 scope.go:117] "RemoveContainer" containerID="4aaecdf6bb9b743b6f0752e1d8495b1b132721653cd02a885481706ffd83405d" Apr 
17 14:47:15.375479 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:47:15.375460 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8_llm(f75b132f-d730-40ea-9dc0-96a3c4ccc878)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8" podUID="f75b132f-d730-40ea-9dc0-96a3c4ccc878" Apr 17 14:47:16.752787 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:16.752745 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-69mx7" Apr 17 14:47:16.753206 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:16.752838 2582 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-69mx7" Apr 17 14:47:16.753287 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:16.753272 2582 scope.go:117] "RemoveContainer" containerID="32bb632d035ee6d1183ea245a9646c06a1c61abe5344aa08e812dad75bc9ccfe" Apr 17 14:47:16.753480 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:47:16.753462 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 10s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-69mx7_llm(d4e084cc-4e34-4e43-a5cc-8c48e232529a)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-69mx7" podUID="d4e084cc-4e34-4e43-a5cc-8c48e232529a" Apr 17 14:47:17.741533 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:17.741503 2582 scope.go:117] "RemoveContainer" containerID="32bb632d035ee6d1183ea245a9646c06a1c61abe5344aa08e812dad75bc9ccfe" Apr 17 14:47:18.739036 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:18.739007 2582 scope.go:117] "RemoveContainer" 
containerID="be8e22aee68095c4ac113e5b0f2e5d185e4142149f443a6f7bb057df1d2e6e29" Apr 17 14:47:18.739420 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:47:18.739159 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 20s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-cjhgj_llm(f9a2b25f-f431-45a4-970d-44f3af0f7ec5)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-cjhgj" podUID="f9a2b25f-f431-45a4-970d-44f3af0f7ec5" Apr 17 14:47:18.745216 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:18.745186 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-69mx7_d4e084cc-4e34-4e43-a5cc-8c48e232529a/main/2.log" Apr 17 14:47:18.745567 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:18.745550 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-69mx7_d4e084cc-4e34-4e43-a5cc-8c48e232529a/main/1.log" Apr 17 14:47:18.745938 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:18.745915 2582 generic.go:358] "Generic (PLEG): container finished" podID="d4e084cc-4e34-4e43-a5cc-8c48e232529a" containerID="442e55e7c0ec2b8e34e57e5916e0ace660ab0265f8f77302b8c324985b0b860f" exitCode=2 Apr 17 14:47:18.746009 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:18.745987 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-69mx7" event={"ID":"d4e084cc-4e34-4e43-a5cc-8c48e232529a","Type":"ContainerDied","Data":"442e55e7c0ec2b8e34e57e5916e0ace660ab0265f8f77302b8c324985b0b860f"} Apr 17 14:47:18.746050 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:18.746036 2582 scope.go:117] "RemoveContainer" containerID="32bb632d035ee6d1183ea245a9646c06a1c61abe5344aa08e812dad75bc9ccfe" Apr 17 14:47:18.746511 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:18.746492 2582 scope.go:117] "RemoveContainer" 
containerID="442e55e7c0ec2b8e34e57e5916e0ace660ab0265f8f77302b8c324985b0b860f" Apr 17 14:47:18.746736 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:47:18.746717 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 20s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-69mx7_llm(d4e084cc-4e34-4e43-a5cc-8c48e232529a)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-69mx7" podUID="d4e084cc-4e34-4e43-a5cc-8c48e232529a" Apr 17 14:47:19.743772 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:19.743745 2582 scope.go:117] "RemoveContainer" containerID="8f9f12ba0c603a909193dc89839f01599e1edc4fd1b5176fb9e08359363dbf70" Apr 17 14:47:19.751736 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:19.751708 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-69mx7_d4e084cc-4e34-4e43-a5cc-8c48e232529a/main/2.log" Apr 17 14:47:20.758482 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:20.758453 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5_fe291b55-da84-48e3-9dfe-e7491ab46bdf/main/3.log" Apr 17 14:47:20.758932 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:20.758771 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5_fe291b55-da84-48e3-9dfe-e7491ab46bdf/main/2.log" Apr 17 14:47:20.759159 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:20.759137 2582 generic.go:358] "Generic (PLEG): container finished" podID="fe291b55-da84-48e3-9dfe-e7491ab46bdf" containerID="7f9e1bde48e676c18a0f1a73cadb8ab22faa6423619d80536add0bd69cad5839" exitCode=2 Apr 17 14:47:20.759230 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:20.759212 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5" 
event={"ID":"fe291b55-da84-48e3-9dfe-e7491ab46bdf","Type":"ContainerDied","Data":"7f9e1bde48e676c18a0f1a73cadb8ab22faa6423619d80536add0bd69cad5839"} Apr 17 14:47:20.759268 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:20.759253 2582 scope.go:117] "RemoveContainer" containerID="8f9f12ba0c603a909193dc89839f01599e1edc4fd1b5176fb9e08359363dbf70" Apr 17 14:47:20.759696 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:20.759678 2582 scope.go:117] "RemoveContainer" containerID="7f9e1bde48e676c18a0f1a73cadb8ab22faa6423619d80536add0bd69cad5839" Apr 17 14:47:20.759935 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:47:20.759916 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5_llm(fe291b55-da84-48e3-9dfe-e7491ab46bdf)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5" podUID="fe291b55-da84-48e3-9dfe-e7491ab46bdf" Apr 17 14:47:21.172933 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:21.172846 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-6d5965695-zhl2l"] Apr 17 14:47:21.210386 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:21.210343 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-6d5965695-zhl2l"] Apr 17 14:47:21.210559 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:21.210504 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-zhl2l" Apr 17 14:47:21.212819 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:21.212782 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-trlp-test-simulated-kserve-self-signed-certs\"" Apr 17 14:47:21.277497 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:21.277441 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/86f00b01-b2bf-4097-8c2e-85946b07e499-home\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-zhl2l\" (UID: \"86f00b01-b2bf-4097-8c2e-85946b07e499\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-zhl2l" Apr 17 14:47:21.277497 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:21.277497 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vt42f\" (UniqueName: \"kubernetes.io/projected/86f00b01-b2bf-4097-8c2e-85946b07e499-kube-api-access-vt42f\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-zhl2l\" (UID: \"86f00b01-b2bf-4097-8c2e-85946b07e499\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-zhl2l" Apr 17 14:47:21.277782 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:21.277595 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/86f00b01-b2bf-4097-8c2e-85946b07e499-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-zhl2l\" (UID: \"86f00b01-b2bf-4097-8c2e-85946b07e499\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-zhl2l" Apr 17 14:47:21.277782 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:21.277643 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/86f00b01-b2bf-4097-8c2e-85946b07e499-tls-certs\") pod 
\"e2e-trlp-test-simulated-kserve-6d5965695-zhl2l\" (UID: \"86f00b01-b2bf-4097-8c2e-85946b07e499\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-zhl2l" Apr 17 14:47:21.277782 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:21.277727 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/86f00b01-b2bf-4097-8c2e-85946b07e499-dshm\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-zhl2l\" (UID: \"86f00b01-b2bf-4097-8c2e-85946b07e499\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-zhl2l" Apr 17 14:47:21.277782 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:21.277765 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/86f00b01-b2bf-4097-8c2e-85946b07e499-model-cache\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-zhl2l\" (UID: \"86f00b01-b2bf-4097-8c2e-85946b07e499\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-zhl2l" Apr 17 14:47:21.378464 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:21.378424 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/86f00b01-b2bf-4097-8c2e-85946b07e499-model-cache\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-zhl2l\" (UID: \"86f00b01-b2bf-4097-8c2e-85946b07e499\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-zhl2l" Apr 17 14:47:21.378464 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:21.378473 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/86f00b01-b2bf-4097-8c2e-85946b07e499-home\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-zhl2l\" (UID: \"86f00b01-b2bf-4097-8c2e-85946b07e499\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-zhl2l" Apr 17 14:47:21.378707 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:21.378496 2582 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vt42f\" (UniqueName: \"kubernetes.io/projected/86f00b01-b2bf-4097-8c2e-85946b07e499-kube-api-access-vt42f\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-zhl2l\" (UID: \"86f00b01-b2bf-4097-8c2e-85946b07e499\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-zhl2l" Apr 17 14:47:21.378707 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:21.378532 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/86f00b01-b2bf-4097-8c2e-85946b07e499-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-zhl2l\" (UID: \"86f00b01-b2bf-4097-8c2e-85946b07e499\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-zhl2l" Apr 17 14:47:21.378707 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:21.378587 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/86f00b01-b2bf-4097-8c2e-85946b07e499-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-zhl2l\" (UID: \"86f00b01-b2bf-4097-8c2e-85946b07e499\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-zhl2l" Apr 17 14:47:21.378707 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:21.378680 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/86f00b01-b2bf-4097-8c2e-85946b07e499-dshm\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-zhl2l\" (UID: \"86f00b01-b2bf-4097-8c2e-85946b07e499\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-zhl2l" Apr 17 14:47:21.378927 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:21.378891 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/86f00b01-b2bf-4097-8c2e-85946b07e499-home\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-zhl2l\" (UID: 
\"86f00b01-b2bf-4097-8c2e-85946b07e499\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-zhl2l" Apr 17 14:47:21.378982 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:21.378950 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/86f00b01-b2bf-4097-8c2e-85946b07e499-model-cache\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-zhl2l\" (UID: \"86f00b01-b2bf-4097-8c2e-85946b07e499\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-zhl2l" Apr 17 14:47:21.378982 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:21.378973 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/86f00b01-b2bf-4097-8c2e-85946b07e499-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-zhl2l\" (UID: \"86f00b01-b2bf-4097-8c2e-85946b07e499\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-zhl2l" Apr 17 14:47:21.381103 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:21.381076 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/86f00b01-b2bf-4097-8c2e-85946b07e499-dshm\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-zhl2l\" (UID: \"86f00b01-b2bf-4097-8c2e-85946b07e499\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-zhl2l" Apr 17 14:47:21.381325 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:21.381303 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/86f00b01-b2bf-4097-8c2e-85946b07e499-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-zhl2l\" (UID: \"86f00b01-b2bf-4097-8c2e-85946b07e499\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-zhl2l" Apr 17 14:47:21.390848 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:21.390791 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vt42f\" (UniqueName: 
\"kubernetes.io/projected/86f00b01-b2bf-4097-8c2e-85946b07e499-kube-api-access-vt42f\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-zhl2l\" (UID: \"86f00b01-b2bf-4097-8c2e-85946b07e499\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-zhl2l" Apr 17 14:47:21.521793 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:21.521750 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-zhl2l" Apr 17 14:47:21.659659 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:21.659624 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-6d5965695-zhl2l"] Apr 17 14:47:21.660639 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:47:21.660606 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86f00b01_b2bf_4097_8c2e_85946b07e499.slice/crio-674ca35730c327c0db54930637d002ce39ee1ed7dfed3de4f36d39c14660eeb6 WatchSource:0}: Error finding container 674ca35730c327c0db54930637d002ce39ee1ed7dfed3de4f36d39c14660eeb6: Status 404 returned error can't find the container with id 674ca35730c327c0db54930637d002ce39ee1ed7dfed3de4f36d39c14660eeb6 Apr 17 14:47:21.770085 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:21.770038 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-zhl2l" event={"ID":"86f00b01-b2bf-4097-8c2e-85946b07e499","Type":"ContainerStarted","Data":"b8af82eacd5e4741c82a047fbac0d4ba6e72bdf2bd9b1433e1f69bf033c5515d"} Apr 17 14:47:21.770085 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:21.770094 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-zhl2l" event={"ID":"86f00b01-b2bf-4097-8c2e-85946b07e499","Type":"ContainerStarted","Data":"674ca35730c327c0db54930637d002ce39ee1ed7dfed3de4f36d39c14660eeb6"} Apr 17 14:47:21.771710 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:21.771690 2582 
log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5_fe291b55-da84-48e3-9dfe-e7491ab46bdf/main/3.log"
Apr 17 14:47:25.255514 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:25.255421    2582 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5"
Apr 17 14:47:25.255514 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:25.255473    2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5"
Apr 17 14:47:25.256074 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:25.255978    2582 scope.go:117] "RemoveContainer" containerID="7f9e1bde48e676c18a0f1a73cadb8ab22faa6423619d80536add0bd69cad5839"
Apr 17 14:47:25.256195 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:47:25.256175    2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5_llm(fe291b55-da84-48e3-9dfe-e7491ab46bdf)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5" podUID="fe291b55-da84-48e3-9dfe-e7491ab46bdf"
Apr 17 14:47:25.738693 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:25.738594    2582 scope.go:117] "RemoveContainer" containerID="eaa2b90f5067f49270e20f629bc55aba3f20c5832a9e6df5cf88be8ee7301f8e"
Apr 17 14:47:25.738926 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:47:25.738902    2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w_llm(42ab39a2-9f59-485a-85b3-c767544c515b)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w" podUID="42ab39a2-9f59-485a-85b3-c767544c515b"
Apr 17 14:47:26.753097 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:26.753053    2582 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-69mx7"
Apr 17 14:47:26.753097 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:26.753106    2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-69mx7"
Apr 17 14:47:26.753577 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:26.753562    2582 scope.go:117] "RemoveContainer" containerID="442e55e7c0ec2b8e34e57e5916e0ace660ab0265f8f77302b8c324985b0b860f"
Apr 17 14:47:26.753764 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:47:26.753746    2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 20s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-69mx7_llm(d4e084cc-4e34-4e43-a5cc-8c48e232529a)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-69mx7" podUID="d4e084cc-4e34-4e43-a5cc-8c48e232529a"
Apr 17 14:47:27.801041 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:27.801006    2582 generic.go:358] "Generic (PLEG): container finished" podID="86f00b01-b2bf-4097-8c2e-85946b07e499" containerID="b8af82eacd5e4741c82a047fbac0d4ba6e72bdf2bd9b1433e1f69bf033c5515d" exitCode=0
Apr 17 14:47:27.801414 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:27.801078    2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-zhl2l" event={"ID":"86f00b01-b2bf-4097-8c2e-85946b07e499","Type":"ContainerDied","Data":"b8af82eacd5e4741c82a047fbac0d4ba6e72bdf2bd9b1433e1f69bf033c5515d"}
Apr 17 14:47:29.738752 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:29.738716    2582 scope.go:117] "RemoveContainer" containerID="4aaecdf6bb9b743b6f0752e1d8495b1b132721653cd02a885481706ffd83405d"
Apr 17 14:47:29.739255 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:47:29.739005    2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8_llm(f75b132f-d730-40ea-9dc0-96a3c4ccc878)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8" podUID="f75b132f-d730-40ea-9dc0-96a3c4ccc878"
Apr 17 14:47:31.823230 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:31.823188    2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-zhl2l" event={"ID":"86f00b01-b2bf-4097-8c2e-85946b07e499","Type":"ContainerStarted","Data":"c6c16a5bf3c2fe3213fd412f8b9e314fe1a32cdd80dbfcf5520e1667ece21fea"}
Apr 17 14:47:31.823619 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:31.823396    2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-zhl2l"
Apr 17 14:47:31.843197 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:31.843137    2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-zhl2l" podStartSLOduration=6.977900608 podStartE2EDuration="10.843119803s" podCreationTimestamp="2026-04-17 14:47:21 +0000 UTC" firstStartedPulling="2026-04-17 14:47:27.801843114 +0000 UTC m=+800.669228742" lastFinishedPulling="2026-04-17 14:47:31.667062291 +0000 UTC m=+804.534447937" observedRunningTime="2026-04-17 14:47:31.839642467 +0000 UTC m=+804.707028209" watchObservedRunningTime="2026-04-17 14:47:31.843119803 +0000 UTC m=+804.710505454"
Apr 17 14:47:33.738706 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:33.738666    2582 scope.go:117] "RemoveContainer" containerID="be8e22aee68095c4ac113e5b0f2e5d185e4142149f443a6f7bb057df1d2e6e29"
Apr 17 14:47:34.840612 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:34.840580    2582 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-cjhgj_f9a2b25f-f431-45a4-970d-44f3af0f7ec5/main/3.log"
Apr 17 14:47:34.841056 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:34.840966    2582 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-cjhgj_f9a2b25f-f431-45a4-970d-44f3af0f7ec5/main/2.log"
Apr 17 14:47:34.841310 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:34.841289    2582 generic.go:358] "Generic (PLEG): container finished" podID="f9a2b25f-f431-45a4-970d-44f3af0f7ec5" containerID="012ae0f83bf4ea80e73a246861136ac1985adf250c5e3ee36b5344a7bfb264d4" exitCode=2
Apr 17 14:47:34.841381 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:34.841362    2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-cjhgj" event={"ID":"f9a2b25f-f431-45a4-970d-44f3af0f7ec5","Type":"ContainerDied","Data":"012ae0f83bf4ea80e73a246861136ac1985adf250c5e3ee36b5344a7bfb264d4"}
Apr 17 14:47:34.841428 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:34.841405    2582 scope.go:117] "RemoveContainer" containerID="be8e22aee68095c4ac113e5b0f2e5d185e4142149f443a6f7bb057df1d2e6e29"
Apr 17 14:47:34.841925 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:34.841901    2582 scope.go:117] "RemoveContainer" containerID="012ae0f83bf4ea80e73a246861136ac1985adf250c5e3ee36b5344a7bfb264d4"
Apr 17 14:47:34.842182 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:47:34.842164    2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-cjhgj_llm(f9a2b25f-f431-45a4-970d-44f3af0f7ec5)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-cjhgj" podUID="f9a2b25f-f431-45a4-970d-44f3af0f7ec5"
Apr 17 14:47:35.846857 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:35.846829    2582 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-cjhgj_f9a2b25f-f431-45a4-970d-44f3af0f7ec5/main/3.log"
Apr 17 14:47:37.742634 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:37.742599    2582 scope.go:117] "RemoveContainer" containerID="eaa2b90f5067f49270e20f629bc55aba3f20c5832a9e6df5cf88be8ee7301f8e"
Apr 17 14:47:37.743134 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:47:37.742842    2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w_llm(42ab39a2-9f59-485a-85b3-c767544c515b)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w" podUID="42ab39a2-9f59-485a-85b3-c767544c515b"
Apr 17 14:47:40.738785 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:40.738751    2582 scope.go:117] "RemoveContainer" containerID="7f9e1bde48e676c18a0f1a73cadb8ab22faa6423619d80536add0bd69cad5839"
Apr 17 14:47:40.739202 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:47:40.738999    2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5_llm(fe291b55-da84-48e3-9dfe-e7491ab46bdf)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5" podUID="fe291b55-da84-48e3-9dfe-e7491ab46bdf"
Apr 17 14:47:41.739607 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:41.739564    2582 scope.go:117] "RemoveContainer" containerID="442e55e7c0ec2b8e34e57e5916e0ace660ab0265f8f77302b8c324985b0b860f"
Apr 17 14:47:42.738742 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:42.738705    2582 scope.go:117] "RemoveContainer" containerID="4aaecdf6bb9b743b6f0752e1d8495b1b132721653cd02a885481706ffd83405d"
Apr 17 14:47:42.738998 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:47:42.738968    2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8_llm(f75b132f-d730-40ea-9dc0-96a3c4ccc878)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8" podUID="f75b132f-d730-40ea-9dc0-96a3c4ccc878"
Apr 17 14:47:42.841046 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:42.841012    2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-zhl2l"
Apr 17 14:47:42.877472 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:42.877429    2582 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-69mx7_d4e084cc-4e34-4e43-a5cc-8c48e232529a/main/3.log"
Apr 17 14:47:42.877913 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:42.877892    2582 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-69mx7_d4e084cc-4e34-4e43-a5cc-8c48e232529a/main/2.log"
Apr 17 14:47:42.878294 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:42.878265    2582 generic.go:358] "Generic (PLEG): container finished" podID="d4e084cc-4e34-4e43-a5cc-8c48e232529a" containerID="d8ac652e9f8fdfb825bf73dd1c8862324c37c8b135a178722fe30f0a573d01a7" exitCode=2
Apr 17 14:47:42.878419 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:42.878302    2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-69mx7" event={"ID":"d4e084cc-4e34-4e43-a5cc-8c48e232529a","Type":"ContainerDied","Data":"d8ac652e9f8fdfb825bf73dd1c8862324c37c8b135a178722fe30f0a573d01a7"}
Apr 17 14:47:42.878419 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:42.878347    2582 scope.go:117] "RemoveContainer" containerID="442e55e7c0ec2b8e34e57e5916e0ace660ab0265f8f77302b8c324985b0b860f"
Apr 17 14:47:42.878791 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:42.878775    2582 scope.go:117] "RemoveContainer" containerID="d8ac652e9f8fdfb825bf73dd1c8862324c37c8b135a178722fe30f0a573d01a7"
Apr 17 14:47:42.879150 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:47:42.879108    2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-69mx7_llm(d4e084cc-4e34-4e43-a5cc-8c48e232529a)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-69mx7" podUID="d4e084cc-4e34-4e43-a5cc-8c48e232529a"
Apr 17 14:47:43.883969 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:43.883940    2582 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-69mx7_d4e084cc-4e34-4e43-a5cc-8c48e232529a/main/3.log"
Apr 17 14:47:44.560790 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:44.560749    2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-cjhgj"
Apr 17 14:47:44.560790 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:44.560793    2582 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-cjhgj"
Apr 17 14:47:44.561427 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:44.561408    2582 scope.go:117] "RemoveContainer" containerID="012ae0f83bf4ea80e73a246861136ac1985adf250c5e3ee36b5344a7bfb264d4"
Apr 17 14:47:44.561645 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:47:44.561626    2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-cjhgj_llm(f9a2b25f-f431-45a4-970d-44f3af0f7ec5)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-cjhgj" podUID="f9a2b25f-f431-45a4-970d-44f3af0f7ec5"
Apr 17 14:47:46.752839 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:46.752748    2582 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-69mx7"
Apr 17 14:47:46.752839 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:46.752796    2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-69mx7"
Apr 17 14:47:46.753294 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:46.753284    2582 scope.go:117] "RemoveContainer" containerID="d8ac652e9f8fdfb825bf73dd1c8862324c37c8b135a178722fe30f0a573d01a7"
Apr 17 14:47:46.753492 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:47:46.753471    2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-69mx7_llm(d4e084cc-4e34-4e43-a5cc-8c48e232529a)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-69mx7" podUID="d4e084cc-4e34-4e43-a5cc-8c48e232529a"
Apr 17 14:47:51.739042 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:51.739005    2582 scope.go:117] "RemoveContainer" containerID="eaa2b90f5067f49270e20f629bc55aba3f20c5832a9e6df5cf88be8ee7301f8e"
Apr 17 14:47:52.924901 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:52.924868    2582 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w_42ab39a2-9f59-485a-85b3-c767544c515b/main/4.log"
Apr 17 14:47:52.925569 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:52.925543    2582 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w_42ab39a2-9f59-485a-85b3-c767544c515b/main/3.log"
Apr 17 14:47:52.925988 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:52.925960    2582 generic.go:358] "Generic (PLEG): container finished" podID="42ab39a2-9f59-485a-85b3-c767544c515b" containerID="f0b24d54d15b3b1b45345ce1e2e2454fd7dd637f1bcabe9a35f5192a93f26b12" exitCode=2
Apr 17 14:47:52.926121 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:52.926025    2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w" event={"ID":"42ab39a2-9f59-485a-85b3-c767544c515b","Type":"ContainerDied","Data":"f0b24d54d15b3b1b45345ce1e2e2454fd7dd637f1bcabe9a35f5192a93f26b12"}
Apr 17 14:47:52.926121 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:52.926078    2582 scope.go:117] "RemoveContainer" containerID="eaa2b90f5067f49270e20f629bc55aba3f20c5832a9e6df5cf88be8ee7301f8e"
Apr 17 14:47:52.926551 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:52.926532    2582 scope.go:117] "RemoveContainer" containerID="f0b24d54d15b3b1b45345ce1e2e2454fd7dd637f1bcabe9a35f5192a93f26b12"
Apr 17 14:47:52.926844 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:47:52.926785    2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w_llm(42ab39a2-9f59-485a-85b3-c767544c515b)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w" podUID="42ab39a2-9f59-485a-85b3-c767544c515b"
Apr 17 14:47:53.932273 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:53.932247    2582 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w_42ab39a2-9f59-485a-85b3-c767544c515b/main/4.log"
Apr 17 14:47:55.739268 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:55.739235    2582 scope.go:117] "RemoveContainer" containerID="7f9e1bde48e676c18a0f1a73cadb8ab22faa6423619d80536add0bd69cad5839"
Apr 17 14:47:55.739672 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:47:55.739421    2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5_llm(fe291b55-da84-48e3-9dfe-e7491ab46bdf)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5" podUID="fe291b55-da84-48e3-9dfe-e7491ab46bdf"
Apr 17 14:47:56.738480 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:56.738443    2582 scope.go:117] "RemoveContainer" containerID="4aaecdf6bb9b743b6f0752e1d8495b1b132721653cd02a885481706ffd83405d"
Apr 17 14:47:57.951338 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:57.951312    2582 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8_f75b132f-d730-40ea-9dc0-96a3c4ccc878/main/4.log"
Apr 17 14:47:57.951749 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:57.951678    2582 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8_f75b132f-d730-40ea-9dc0-96a3c4ccc878/main/3.log"
Apr 17 14:47:57.952045 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:57.952023    2582 generic.go:358] "Generic (PLEG): container finished" podID="f75b132f-d730-40ea-9dc0-96a3c4ccc878" containerID="56fe0e9366b7761692ff1b6f9ff4c6755f29353bf6fd8a5e8c5de4d23f38a3d5" exitCode=2
Apr 17 14:47:57.952118 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:57.952091    2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8" event={"ID":"f75b132f-d730-40ea-9dc0-96a3c4ccc878","Type":"ContainerDied","Data":"56fe0e9366b7761692ff1b6f9ff4c6755f29353bf6fd8a5e8c5de4d23f38a3d5"}
Apr 17 14:47:57.952161 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:57.952135    2582 scope.go:117] "RemoveContainer" containerID="4aaecdf6bb9b743b6f0752e1d8495b1b132721653cd02a885481706ffd83405d"
Apr 17 14:47:57.952592 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:57.952575    2582 scope.go:117] "RemoveContainer" containerID="56fe0e9366b7761692ff1b6f9ff4c6755f29353bf6fd8a5e8c5de4d23f38a3d5"
Apr 17 14:47:57.952849 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:47:57.952826    2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8_llm(f75b132f-d730-40ea-9dc0-96a3c4ccc878)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8" podUID="f75b132f-d730-40ea-9dc0-96a3c4ccc878"
Apr 17 14:47:58.957787 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:58.957756    2582 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8_f75b132f-d730-40ea-9dc0-96a3c4ccc878/main/4.log"
Apr 17 14:47:59.739223 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:59.739185    2582 scope.go:117] "RemoveContainer" containerID="d8ac652e9f8fdfb825bf73dd1c8862324c37c8b135a178722fe30f0a573d01a7"
Apr 17 14:47:59.739427 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:59.739250    2582 scope.go:117] "RemoveContainer" containerID="012ae0f83bf4ea80e73a246861136ac1985adf250c5e3ee36b5344a7bfb264d4"
Apr 17 14:47:59.739504 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:47:59.739443    2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-cjhgj_llm(f9a2b25f-f431-45a4-970d-44f3af0f7ec5)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-cjhgj" podUID="f9a2b25f-f431-45a4-970d-44f3af0f7ec5"
Apr 17 14:47:59.739504 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:47:59.739448    2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-69mx7_llm(d4e084cc-4e34-4e43-a5cc-8c48e232529a)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-69mx7" podUID="d4e084cc-4e34-4e43-a5cc-8c48e232529a"
Apr 17 14:47:59.955013 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:59.954972    2582 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w"
Apr 17 14:47:59.955013 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:59.955018    2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w"
Apr 17 14:47:59.955492 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:47:59.955462    2582 scope.go:117] "RemoveContainer" containerID="f0b24d54d15b3b1b45345ce1e2e2454fd7dd637f1bcabe9a35f5192a93f26b12"
Apr 17 14:47:59.955674 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:47:59.955655    2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w_llm(42ab39a2-9f59-485a-85b3-c767544c515b)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w" podUID="42ab39a2-9f59-485a-85b3-c767544c515b"
Apr 17 14:48:05.374154 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:48:05.374113    2582 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8"
Apr 17 14:48:05.374154 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:48:05.374153    2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8"
Apr 17 14:48:05.374638 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:48:05.374619    2582 scope.go:117] "RemoveContainer" containerID="56fe0e9366b7761692ff1b6f9ff4c6755f29353bf6fd8a5e8c5de4d23f38a3d5"
Apr 17 14:48:05.374865 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:48:05.374847    2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8_llm(f75b132f-d730-40ea-9dc0-96a3c4ccc878)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8" podUID="f75b132f-d730-40ea-9dc0-96a3c4ccc878"
Apr 17 14:48:07.742434 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:48:07.742403    2582 scope.go:117] "RemoveContainer" containerID="7f9e1bde48e676c18a0f1a73cadb8ab22faa6423619d80536add0bd69cad5839"
Apr 17 14:48:09.001440 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:48:09.001405    2582 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5_fe291b55-da84-48e3-9dfe-e7491ab46bdf/main/4.log"
Apr 17 14:48:09.001849 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:48:09.001819    2582 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5_fe291b55-da84-48e3-9dfe-e7491ab46bdf/main/3.log"
Apr 17 14:48:09.002183 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:48:09.002160    2582 generic.go:358] "Generic (PLEG): container finished" podID="fe291b55-da84-48e3-9dfe-e7491ab46bdf" containerID="6590174e69b91e9c0b569c388ee44d3589274c74f486c9f2b8486c09bc9ed720" exitCode=2
Apr 17 14:48:09.002258 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:48:09.002239    2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5" event={"ID":"fe291b55-da84-48e3-9dfe-e7491ab46bdf","Type":"ContainerDied","Data":"6590174e69b91e9c0b569c388ee44d3589274c74f486c9f2b8486c09bc9ed720"}
Apr 17 14:48:09.002296 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:48:09.002282    2582 scope.go:117] "RemoveContainer" containerID="7f9e1bde48e676c18a0f1a73cadb8ab22faa6423619d80536add0bd69cad5839"
Apr 17 14:48:09.002830 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:48:09.002782    2582 scope.go:117] "RemoveContainer" containerID="6590174e69b91e9c0b569c388ee44d3589274c74f486c9f2b8486c09bc9ed720"
Apr 17 14:48:09.003047 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:48:09.003027    2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5_llm(fe291b55-da84-48e3-9dfe-e7491ab46bdf)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5" podUID="fe291b55-da84-48e3-9dfe-e7491ab46bdf"
Apr 17 14:48:10.008480 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:48:10.008451    2582 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5_fe291b55-da84-48e3-9dfe-e7491ab46bdf/main/4.log"
Apr 17 14:48:12.738575 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:48:12.738538    2582 scope.go:117] "RemoveContainer" containerID="012ae0f83bf4ea80e73a246861136ac1985adf250c5e3ee36b5344a7bfb264d4"
Apr 17 14:48:12.739029 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:48:12.738753    2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-cjhgj_llm(f9a2b25f-f431-45a4-970d-44f3af0f7ec5)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-cjhgj" podUID="f9a2b25f-f431-45a4-970d-44f3af0f7ec5"
Apr 17 14:48:13.739723 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:48:13.739684    2582 scope.go:117] "RemoveContainer" containerID="f0b24d54d15b3b1b45345ce1e2e2454fd7dd637f1bcabe9a35f5192a93f26b12"
Apr 17 14:48:13.744990 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:48:13.744947    2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w_llm(42ab39a2-9f59-485a-85b3-c767544c515b)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w" podUID="42ab39a2-9f59-485a-85b3-c767544c515b"
Apr 17 14:48:14.738495 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:48:14.738460    2582 scope.go:117] "RemoveContainer" containerID="d8ac652e9f8fdfb825bf73dd1c8862324c37c8b135a178722fe30f0a573d01a7"
Apr 17 14:48:14.738668 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:48:14.738640    2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-69mx7_llm(d4e084cc-4e34-4e43-a5cc-8c48e232529a)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-69mx7" podUID="d4e084cc-4e34-4e43-a5cc-8c48e232529a"
Apr 17 14:48:15.255039 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:48:15.254987    2582 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5"
Apr 17 14:48:15.255039 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:48:15.255027    2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5"
Apr 17 14:48:15.255487 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:48:15.255458    2582 scope.go:117] "RemoveContainer" containerID="6590174e69b91e9c0b569c388ee44d3589274c74f486c9f2b8486c09bc9ed720"
Apr 17 14:48:15.255672 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:48:15.255655    2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5_llm(fe291b55-da84-48e3-9dfe-e7491ab46bdf)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5" podUID="fe291b55-da84-48e3-9dfe-e7491ab46bdf"
Apr 17 14:48:20.738527 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:48:20.738491    2582 scope.go:117] "RemoveContainer" containerID="56fe0e9366b7761692ff1b6f9ff4c6755f29353bf6fd8a5e8c5de4d23f38a3d5"
Apr 17 14:48:20.738937 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:48:20.738679    2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8_llm(f75b132f-d730-40ea-9dc0-96a3c4ccc878)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8" podUID="f75b132f-d730-40ea-9dc0-96a3c4ccc878"
Apr 17 14:48:23.739309 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:48:23.739270    2582 scope.go:117] "RemoveContainer" containerID="012ae0f83bf4ea80e73a246861136ac1985adf250c5e3ee36b5344a7bfb264d4"
Apr 17 14:48:24.069267 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:48:24.069227    2582 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-cjhgj_f9a2b25f-f431-45a4-970d-44f3af0f7ec5/main/4.log"
Apr 17 14:48:24.069703 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:48:24.069685    2582 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-cjhgj_f9a2b25f-f431-45a4-970d-44f3af0f7ec5/main/3.log"
Apr 17 14:48:24.070116 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:48:24.070085    2582 generic.go:358] "Generic (PLEG): container finished" podID="f9a2b25f-f431-45a4-970d-44f3af0f7ec5" containerID="4b854ea16ad9971fe05cd4e21cc7ff74764043088515d13ac46593e5e6b9d862" exitCode=2
Apr 17 14:48:24.070234 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:48:24.070146    2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-cjhgj" event={"ID":"f9a2b25f-f431-45a4-970d-44f3af0f7ec5","Type":"ContainerDied","Data":"4b854ea16ad9971fe05cd4e21cc7ff74764043088515d13ac46593e5e6b9d862"}
Apr 17 14:48:24.070234 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:48:24.070206    2582 scope.go:117] "RemoveContainer" containerID="012ae0f83bf4ea80e73a246861136ac1985adf250c5e3ee36b5344a7bfb264d4"
Apr 17 14:48:24.070688 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:48:24.070666    2582 scope.go:117] "RemoveContainer" containerID="4b854ea16ad9971fe05cd4e21cc7ff74764043088515d13ac46593e5e6b9d862"
Apr 17 14:48:24.070924 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:48:24.070901    2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-cjhgj_llm(f9a2b25f-f431-45a4-970d-44f3af0f7ec5)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-cjhgj" podUID="f9a2b25f-f431-45a4-970d-44f3af0f7ec5"
Apr 17 14:48:24.560495 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:48:24.560461    2582 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-cjhgj"
Apr 17 14:48:24.560495 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:48:24.560499    2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-cjhgj"
Apr 17 14:48:24.738587 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:48:24.738554    2582 scope.go:117] "RemoveContainer" containerID="f0b24d54d15b3b1b45345ce1e2e2454fd7dd637f1bcabe9a35f5192a93f26b12"
Apr 17 14:48:24.738840 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:48:24.738786    2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w_llm(42ab39a2-9f59-485a-85b3-c767544c515b)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w" podUID="42ab39a2-9f59-485a-85b3-c767544c515b"
Apr 17 14:48:25.075241 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:48:25.075215    2582 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-cjhgj_f9a2b25f-f431-45a4-970d-44f3af0f7ec5/main/4.log"
Apr 17 14:48:25.076067 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:48:25.076044    2582 scope.go:117] "RemoveContainer" containerID="4b854ea16ad9971fe05cd4e21cc7ff74764043088515d13ac46593e5e6b9d862"
Apr 17 14:48:25.076290 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:48:25.076270    2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-cjhgj_llm(f9a2b25f-f431-45a4-970d-44f3af0f7ec5)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-cjhgj" podUID="f9a2b25f-f431-45a4-970d-44f3af0f7ec5"
Apr 17 14:48:25.738654 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:48:25.738624    2582 scope.go:117] "RemoveContainer" containerID="d8ac652e9f8fdfb825bf73dd1c8862324c37c8b135a178722fe30f0a573d01a7"
Apr 17 14:48:26.081111 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:48:26.081075    2582 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-69mx7_d4e084cc-4e34-4e43-a5cc-8c48e232529a/main/4.log"
Apr 17 14:48:26.081555 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:48:26.081533    2582 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-69mx7_d4e084cc-4e34-4e43-a5cc-8c48e232529a/main/3.log"
Apr 17 14:48:26.081932 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:48:26.081909    2582 generic.go:358] "Generic (PLEG): container finished" podID="d4e084cc-4e34-4e43-a5cc-8c48e232529a" containerID="2677399c73df729ce033d94540d1d0bf75a52f0acdc28deefcf116ba05f3334f" exitCode=2
Apr 17 14:48:26.082004 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:48:26.081978    2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-69mx7" event={"ID":"d4e084cc-4e34-4e43-a5cc-8c48e232529a","Type":"ContainerDied","Data":"2677399c73df729ce033d94540d1d0bf75a52f0acdc28deefcf116ba05f3334f"}
Apr 17 14:48:26.082051 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:48:26.082031    2582 scope.go:117] "RemoveContainer" containerID="d8ac652e9f8fdfb825bf73dd1c8862324c37c8b135a178722fe30f0a573d01a7"
Apr 17 14:48:26.082467 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:48:26.082442    2582 scope.go:117] "RemoveContainer" containerID="2677399c73df729ce033d94540d1d0bf75a52f0acdc28deefcf116ba05f3334f"
Apr 17 14:48:26.082763 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:48:26.082674    2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-69mx7_llm(d4e084cc-4e34-4e43-a5cc-8c48e232529a)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-69mx7" podUID="d4e084cc-4e34-4e43-a5cc-8c48e232529a"
Apr 17 14:48:26.739311 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:48:26.739271    2582 scope.go:117] "RemoveContainer" containerID="6590174e69b91e9c0b569c388ee44d3589274c74f486c9f2b8486c09bc9ed720"
Apr 17 14:48:26.739504 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:48:26.739483    2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5_llm(fe291b55-da84-48e3-9dfe-e7491ab46bdf)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5" podUID="fe291b55-da84-48e3-9dfe-e7491ab46bdf"
Apr 17 14:48:26.752945 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:48:26.752911    2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-69mx7"
Apr 17 14:48:26.752945 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:48:26.752948    2582 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-69mx7"
Apr 17 14:48:27.087357 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:48:27.087320    2582 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-69mx7_d4e084cc-4e34-4e43-a5cc-8c48e232529a/main/4.log"
Apr 17 14:48:27.088077 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:48:27.088060    2582 scope.go:117] "RemoveContainer" containerID="2677399c73df729ce033d94540d1d0bf75a52f0acdc28deefcf116ba05f3334f"
Apr 17 14:48:27.088275 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:48:27.088257    2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-69mx7_llm(d4e084cc-4e34-4e43-a5cc-8c48e232529a)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-69mx7" podUID="d4e084cc-4e34-4e43-a5cc-8c48e232529a"
Apr 17 14:48:32.738880 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:48:32.738837    2582 scope.go:117] "RemoveContainer" containerID="56fe0e9366b7761692ff1b6f9ff4c6755f29353bf6fd8a5e8c5de4d23f38a3d5"
Apr 17 14:48:32.739336 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:48:32.739055    2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8_llm(f75b132f-d730-40ea-9dc0-96a3c4ccc878)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8" podUID="f75b132f-d730-40ea-9dc0-96a3c4ccc878"
Apr 17 14:48:35.738899 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:48:35.738865    2582 scope.go:117] "RemoveContainer"
containerID="f0b24d54d15b3b1b45345ce1e2e2454fd7dd637f1bcabe9a35f5192a93f26b12" Apr 17 14:48:35.739407 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:48:35.739123 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w_llm(42ab39a2-9f59-485a-85b3-c767544c515b)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w" podUID="42ab39a2-9f59-485a-85b3-c767544c515b" Apr 17 14:48:37.742367 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:48:37.742335 2582 scope.go:117] "RemoveContainer" containerID="4b854ea16ad9971fe05cd4e21cc7ff74764043088515d13ac46593e5e6b9d862" Apr 17 14:48:37.742863 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:48:37.742549 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-cjhgj_llm(f9a2b25f-f431-45a4-970d-44f3af0f7ec5)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-cjhgj" podUID="f9a2b25f-f431-45a4-970d-44f3af0f7ec5" Apr 17 14:48:38.738539 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:48:38.738505 2582 scope.go:117] "RemoveContainer" containerID="2677399c73df729ce033d94540d1d0bf75a52f0acdc28deefcf116ba05f3334f" Apr 17 14:48:38.738734 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:48:38.738698 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-69mx7_llm(d4e084cc-4e34-4e43-a5cc-8c48e232529a)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-69mx7" podUID="d4e084cc-4e34-4e43-a5cc-8c48e232529a" Apr 17 14:48:41.739391 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:48:41.739350 2582 scope.go:117] 
"RemoveContainer" containerID="6590174e69b91e9c0b569c388ee44d3589274c74f486c9f2b8486c09bc9ed720" Apr 17 14:48:41.739774 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:48:41.739640 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5_llm(fe291b55-da84-48e3-9dfe-e7491ab46bdf)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5" podUID="fe291b55-da84-48e3-9dfe-e7491ab46bdf" Apr 17 14:48:45.739001 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:48:45.738964 2582 scope.go:117] "RemoveContainer" containerID="56fe0e9366b7761692ff1b6f9ff4c6755f29353bf6fd8a5e8c5de4d23f38a3d5" Apr 17 14:48:45.739385 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:48:45.739197 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8_llm(f75b132f-d730-40ea-9dc0-96a3c4ccc878)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8" podUID="f75b132f-d730-40ea-9dc0-96a3c4ccc878" Apr 17 14:48:49.738517 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:48:49.738479 2582 scope.go:117] "RemoveContainer" containerID="f0b24d54d15b3b1b45345ce1e2e2454fd7dd637f1bcabe9a35f5192a93f26b12" Apr 17 14:48:49.738973 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:48:49.738683 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w_llm(42ab39a2-9f59-485a-85b3-c767544c515b)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w" podUID="42ab39a2-9f59-485a-85b3-c767544c515b" Apr 17 14:48:52.738510 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:48:52.738476 2582 
scope.go:117] "RemoveContainer" containerID="4b854ea16ad9971fe05cd4e21cc7ff74764043088515d13ac46593e5e6b9d862" Apr 17 14:48:52.738921 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:48:52.738694 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-cjhgj_llm(f9a2b25f-f431-45a4-970d-44f3af0f7ec5)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-cjhgj" podUID="f9a2b25f-f431-45a4-970d-44f3af0f7ec5" Apr 17 14:48:53.738733 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:48:53.738688 2582 scope.go:117] "RemoveContainer" containerID="6590174e69b91e9c0b569c388ee44d3589274c74f486c9f2b8486c09bc9ed720" Apr 17 14:48:53.739128 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:48:53.738763 2582 scope.go:117] "RemoveContainer" containerID="2677399c73df729ce033d94540d1d0bf75a52f0acdc28deefcf116ba05f3334f" Apr 17 14:48:53.739128 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:48:53.738965 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-69mx7_llm(d4e084cc-4e34-4e43-a5cc-8c48e232529a)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-69mx7" podUID="d4e084cc-4e34-4e43-a5cc-8c48e232529a" Apr 17 14:48:53.739128 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:48:53.738991 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5_llm(fe291b55-da84-48e3-9dfe-e7491ab46bdf)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5" podUID="fe291b55-da84-48e3-9dfe-e7491ab46bdf" Apr 17 14:48:59.739036 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:48:59.738948 2582 
scope.go:117] "RemoveContainer" containerID="56fe0e9366b7761692ff1b6f9ff4c6755f29353bf6fd8a5e8c5de4d23f38a3d5" Apr 17 14:48:59.739420 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:48:59.739151 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8_llm(f75b132f-d730-40ea-9dc0-96a3c4ccc878)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8" podUID="f75b132f-d730-40ea-9dc0-96a3c4ccc878" Apr 17 14:49:02.739305 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:49:02.739275 2582 scope.go:117] "RemoveContainer" containerID="f0b24d54d15b3b1b45345ce1e2e2454fd7dd637f1bcabe9a35f5192a93f26b12" Apr 17 14:49:02.739679 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:49:02.739480 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w_llm(42ab39a2-9f59-485a-85b3-c767544c515b)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w" podUID="42ab39a2-9f59-485a-85b3-c767544c515b" Apr 17 14:49:05.738828 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:49:05.738754 2582 scope.go:117] "RemoveContainer" containerID="4b854ea16ad9971fe05cd4e21cc7ff74764043088515d13ac46593e5e6b9d862" Apr 17 14:49:05.739299 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:49:05.738903 2582 scope.go:117] "RemoveContainer" containerID="6590174e69b91e9c0b569c388ee44d3589274c74f486c9f2b8486c09bc9ed720" Apr 17 14:49:05.739299 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:49:05.738992 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main 
pod=e2e-distinct-simulated-kserve-8485d77cdf-cjhgj_llm(f9a2b25f-f431-45a4-970d-44f3af0f7ec5)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-cjhgj" podUID="f9a2b25f-f431-45a4-970d-44f3af0f7ec5" Apr 17 14:49:05.739299 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:49:05.739107 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5_llm(fe291b55-da84-48e3-9dfe-e7491ab46bdf)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5" podUID="fe291b55-da84-48e3-9dfe-e7491ab46bdf" Apr 17 14:49:06.738726 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:49:06.738689 2582 scope.go:117] "RemoveContainer" containerID="2677399c73df729ce033d94540d1d0bf75a52f0acdc28deefcf116ba05f3334f" Apr 17 14:49:06.738936 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:49:06.738914 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-69mx7_llm(d4e084cc-4e34-4e43-a5cc-8c48e232529a)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-69mx7" podUID="d4e084cc-4e34-4e43-a5cc-8c48e232529a" Apr 17 14:49:07.670569 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:49:07.670533 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-69mx7_d4e084cc-4e34-4e43-a5cc-8c48e232529a/main/4.log" Apr 17 14:49:07.671275 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:49:07.671249 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-cjhgj_f9a2b25f-f431-45a4-970d-44f3af0f7ec5/main/4.log" Apr 17 14:49:07.671982 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:49:07.671959 2582 log.go:25] "Finished parsing log file" 
path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5_fe291b55-da84-48e3-9dfe-e7491ab46bdf/main/4.log" Apr 17 14:49:07.672583 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:49:07.672563 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8_f75b132f-d730-40ea-9dc0-96a3c4ccc878/main/4.log" Apr 17 14:49:07.672760 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:49:07.672743 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-69mx7_d4e084cc-4e34-4e43-a5cc-8c48e232529a/main/4.log" Apr 17 14:49:07.673231 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:49:07.673216 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w_42ab39a2-9f59-485a-85b3-c767544c515b/main/4.log" Apr 17 14:49:07.673405 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:49:07.673388 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-cjhgj_f9a2b25f-f431-45a4-970d-44f3af0f7ec5/main/4.log" Apr 17 14:49:07.674065 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:49:07.674046 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5_fe291b55-da84-48e3-9dfe-e7491ab46bdf/main/4.log" Apr 17 14:49:07.674655 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:49:07.674639 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8_f75b132f-d730-40ea-9dc0-96a3c4ccc878/main/4.log" Apr 17 14:49:07.675313 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:49:07.675297 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w_42ab39a2-9f59-485a-85b3-c767544c515b/main/4.log" Apr 17 14:49:07.691916 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:49:07.691884 2582 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-bss6z_8cf1a0e0-95fc-4bce-84ed-e3222e5beeb2/console-operator/2.log" Apr 17 14:49:07.693955 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:49:07.693917 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-bss6z_8cf1a0e0-95fc-4bce-84ed-e3222e5beeb2/console-operator/2.log" Apr 17 14:49:10.738738 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:49:10.738698 2582 scope.go:117] "RemoveContainer" containerID="56fe0e9366b7761692ff1b6f9ff4c6755f29353bf6fd8a5e8c5de4d23f38a3d5" Apr 17 14:49:10.739160 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:49:10.738942 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8_llm(f75b132f-d730-40ea-9dc0-96a3c4ccc878)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8" podUID="f75b132f-d730-40ea-9dc0-96a3c4ccc878" Apr 17 14:49:16.739409 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:49:16.739372 2582 scope.go:117] "RemoveContainer" containerID="f0b24d54d15b3b1b45345ce1e2e2454fd7dd637f1bcabe9a35f5192a93f26b12" Apr 17 14:49:17.314525 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:49:17.314489 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w_42ab39a2-9f59-485a-85b3-c767544c515b/main/5.log" Apr 17 14:49:17.314939 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:49:17.314919 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w_42ab39a2-9f59-485a-85b3-c767544c515b/main/4.log" Apr 17 14:49:17.315266 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:49:17.315239 2582 generic.go:358] "Generic (PLEG): container finished" 
podID="42ab39a2-9f59-485a-85b3-c767544c515b" containerID="0d397b127d7933cc4119724b54da1485bfc53f05085b9764b8ae7284a9c25367" exitCode=2 Apr 17 14:49:17.315327 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:49:17.315279 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w" event={"ID":"42ab39a2-9f59-485a-85b3-c767544c515b","Type":"ContainerDied","Data":"0d397b127d7933cc4119724b54da1485bfc53f05085b9764b8ae7284a9c25367"} Apr 17 14:49:17.315369 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:49:17.315327 2582 scope.go:117] "RemoveContainer" containerID="f0b24d54d15b3b1b45345ce1e2e2454fd7dd637f1bcabe9a35f5192a93f26b12" Apr 17 14:49:17.315818 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:49:17.315785 2582 scope.go:117] "RemoveContainer" containerID="0d397b127d7933cc4119724b54da1485bfc53f05085b9764b8ae7284a9c25367" Apr 17 14:49:17.316072 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:49:17.316054 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w_llm(42ab39a2-9f59-485a-85b3-c767544c515b)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w" podUID="42ab39a2-9f59-485a-85b3-c767544c515b" Apr 17 14:49:18.321982 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:49:18.321951 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w_42ab39a2-9f59-485a-85b3-c767544c515b/main/5.log" Apr 17 14:49:18.739106 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:49:18.739007 2582 scope.go:117] "RemoveContainer" containerID="4b854ea16ad9971fe05cd4e21cc7ff74764043088515d13ac46593e5e6b9d862" Apr 17 14:49:18.739261 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:49:18.739124 2582 scope.go:117] "RemoveContainer" containerID="6590174e69b91e9c0b569c388ee44d3589274c74f486c9f2b8486c09bc9ed720" Apr 17 
14:49:18.739311 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:49:18.739284 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-cjhgj_llm(f9a2b25f-f431-45a4-970d-44f3af0f7ec5)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-cjhgj" podUID="f9a2b25f-f431-45a4-970d-44f3af0f7ec5" Apr 17 14:49:18.739362 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:49:18.739326 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5_llm(fe291b55-da84-48e3-9dfe-e7491ab46bdf)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5" podUID="fe291b55-da84-48e3-9dfe-e7491ab46bdf" Apr 17 14:49:19.738986 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:49:19.738951 2582 scope.go:117] "RemoveContainer" containerID="2677399c73df729ce033d94540d1d0bf75a52f0acdc28deefcf116ba05f3334f" Apr 17 14:49:19.739383 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:49:19.739134 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-69mx7_llm(d4e084cc-4e34-4e43-a5cc-8c48e232529a)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-69mx7" podUID="d4e084cc-4e34-4e43-a5cc-8c48e232529a" Apr 17 14:49:19.955599 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:49:19.955542 2582 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w" Apr 17 14:49:19.955599 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:49:19.955603 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w" Apr 17 14:49:19.956118 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:49:19.956098 2582 scope.go:117] "RemoveContainer" containerID="0d397b127d7933cc4119724b54da1485bfc53f05085b9764b8ae7284a9c25367" Apr 17 14:49:19.956365 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:49:19.956330 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w_llm(42ab39a2-9f59-485a-85b3-c767544c515b)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w" podUID="42ab39a2-9f59-485a-85b3-c767544c515b" Apr 17 14:49:24.738739 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:49:24.738701 2582 scope.go:117] "RemoveContainer" containerID="56fe0e9366b7761692ff1b6f9ff4c6755f29353bf6fd8a5e8c5de4d23f38a3d5" Apr 17 14:49:25.353238 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:49:25.353153 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8_f75b132f-d730-40ea-9dc0-96a3c4ccc878/main/5.log" Apr 17 14:49:25.353583 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:49:25.353566 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8_f75b132f-d730-40ea-9dc0-96a3c4ccc878/main/4.log" Apr 17 14:49:25.353944 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:49:25.353922 2582 generic.go:358] "Generic (PLEG): container finished" podID="f75b132f-d730-40ea-9dc0-96a3c4ccc878" containerID="ab1e5398c422006b64928b2cffa6dd22f0117fd171ad33517b2f69dd07f077a7" exitCode=2 Apr 17 14:49:25.354014 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:49:25.353995 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8" 
event={"ID":"f75b132f-d730-40ea-9dc0-96a3c4ccc878","Type":"ContainerDied","Data":"ab1e5398c422006b64928b2cffa6dd22f0117fd171ad33517b2f69dd07f077a7"} Apr 17 14:49:25.354056 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:49:25.354038 2582 scope.go:117] "RemoveContainer" containerID="56fe0e9366b7761692ff1b6f9ff4c6755f29353bf6fd8a5e8c5de4d23f38a3d5" Apr 17 14:49:25.354551 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:49:25.354517 2582 scope.go:117] "RemoveContainer" containerID="ab1e5398c422006b64928b2cffa6dd22f0117fd171ad33517b2f69dd07f077a7" Apr 17 14:49:25.354793 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:49:25.354769 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8_llm(f75b132f-d730-40ea-9dc0-96a3c4ccc878)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8" podUID="f75b132f-d730-40ea-9dc0-96a3c4ccc878" Apr 17 14:49:25.374762 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:49:25.374725 2582 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8" Apr 17 14:49:25.374762 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:49:25.374764 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8" Apr 17 14:49:26.360092 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:49:26.360059 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8_f75b132f-d730-40ea-9dc0-96a3c4ccc878/main/5.log" Apr 17 14:49:26.360845 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:49:26.360826 2582 scope.go:117] "RemoveContainer" containerID="ab1e5398c422006b64928b2cffa6dd22f0117fd171ad33517b2f69dd07f077a7" Apr 17 14:49:26.361050 
ip-10-0-143-171 kubenswrapper[2582]: E0417 14:49:26.361032 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8_llm(f75b132f-d730-40ea-9dc0-96a3c4ccc878)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8" podUID="f75b132f-d730-40ea-9dc0-96a3c4ccc878" Apr 17 14:49:30.738493 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:49:30.738456 2582 scope.go:117] "RemoveContainer" containerID="2677399c73df729ce033d94540d1d0bf75a52f0acdc28deefcf116ba05f3334f" Apr 17 14:49:30.739013 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:49:30.738702 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-69mx7_llm(d4e084cc-4e34-4e43-a5cc-8c48e232529a)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-69mx7" podUID="d4e084cc-4e34-4e43-a5cc-8c48e232529a" Apr 17 14:49:31.738884 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:49:31.738847 2582 scope.go:117] "RemoveContainer" containerID="0d397b127d7933cc4119724b54da1485bfc53f05085b9764b8ae7284a9c25367" Apr 17 14:49:31.739242 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:49:31.738928 2582 scope.go:117] "RemoveContainer" containerID="4b854ea16ad9971fe05cd4e21cc7ff74764043088515d13ac46593e5e6b9d862" Apr 17 14:49:31.739242 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:49:31.739077 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w_llm(42ab39a2-9f59-485a-85b3-c767544c515b)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w" 
podUID="42ab39a2-9f59-485a-85b3-c767544c515b" Apr 17 14:49:31.739242 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:49:31.739120 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-cjhgj_llm(f9a2b25f-f431-45a4-970d-44f3af0f7ec5)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-cjhgj" podUID="f9a2b25f-f431-45a4-970d-44f3af0f7ec5" Apr 17 14:49:33.739410 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:49:33.739377 2582 scope.go:117] "RemoveContainer" containerID="6590174e69b91e9c0b569c388ee44d3589274c74f486c9f2b8486c09bc9ed720" Apr 17 14:49:34.396613 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:49:34.396584 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5_fe291b55-da84-48e3-9dfe-e7491ab46bdf/main/5.log" Apr 17 14:49:34.397062 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:49:34.397041 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5_fe291b55-da84-48e3-9dfe-e7491ab46bdf/main/4.log" Apr 17 14:49:34.397442 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:49:34.397420 2582 generic.go:358] "Generic (PLEG): container finished" podID="fe291b55-da84-48e3-9dfe-e7491ab46bdf" containerID="ec39e091c63f1ca1adb9d5b2ef61f44d3e437830192ebc4505d5d1b4f14b36b9" exitCode=2 Apr 17 14:49:34.397524 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:49:34.397487 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5" event={"ID":"fe291b55-da84-48e3-9dfe-e7491ab46bdf","Type":"ContainerDied","Data":"ec39e091c63f1ca1adb9d5b2ef61f44d3e437830192ebc4505d5d1b4f14b36b9"} Apr 17 14:49:34.397584 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:49:34.397543 2582 scope.go:117] "RemoveContainer" 
containerID="6590174e69b91e9c0b569c388ee44d3589274c74f486c9f2b8486c09bc9ed720" Apr 17 14:49:34.398145 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:49:34.398123 2582 scope.go:117] "RemoveContainer" containerID="ec39e091c63f1ca1adb9d5b2ef61f44d3e437830192ebc4505d5d1b4f14b36b9" Apr 17 14:49:34.398399 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:49:34.398376 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5_llm(fe291b55-da84-48e3-9dfe-e7491ab46bdf)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5" podUID="fe291b55-da84-48e3-9dfe-e7491ab46bdf" Apr 17 14:49:35.254682 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:49:35.254651 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5" Apr 17 14:49:35.254682 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:49:35.254685 2582 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5" Apr 17 14:49:35.403394 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:49:35.403360 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5_fe291b55-da84-48e3-9dfe-e7491ab46bdf/main/5.log" Apr 17 14:49:35.404187 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:49:35.404164 2582 scope.go:117] "RemoveContainer" containerID="ec39e091c63f1ca1adb9d5b2ef61f44d3e437830192ebc4505d5d1b4f14b36b9" Apr 17 14:49:35.404374 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:49:35.404356 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5_llm(fe291b55-da84-48e3-9dfe-e7491ab46bdf)\"" 
pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5" podUID="fe291b55-da84-48e3-9dfe-e7491ab46bdf" Apr 17 14:49:39.738898 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:49:39.738853 2582 scope.go:117] "RemoveContainer" containerID="ab1e5398c422006b64928b2cffa6dd22f0117fd171ad33517b2f69dd07f077a7" Apr 17 14:49:39.739512 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:49:39.739060 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8_llm(f75b132f-d730-40ea-9dc0-96a3c4ccc878)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8" podUID="f75b132f-d730-40ea-9dc0-96a3c4ccc878" Apr 17 14:49:41.739356 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:49:41.739317 2582 scope.go:117] "RemoveContainer" containerID="2677399c73df729ce033d94540d1d0bf75a52f0acdc28deefcf116ba05f3334f" Apr 17 14:49:41.739748 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:49:41.739493 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-69mx7_llm(d4e084cc-4e34-4e43-a5cc-8c48e232529a)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-69mx7" podUID="d4e084cc-4e34-4e43-a5cc-8c48e232529a" Apr 17 14:49:44.738557 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:49:44.738518 2582 scope.go:117] "RemoveContainer" containerID="0d397b127d7933cc4119724b54da1485bfc53f05085b9764b8ae7284a9c25367" Apr 17 14:49:44.738977 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:49:44.738689 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main 
pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w_llm(42ab39a2-9f59-485a-85b3-c767544c515b)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w" podUID="42ab39a2-9f59-485a-85b3-c767544c515b" Apr 17 14:49:46.738712 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:49:46.738675 2582 scope.go:117] "RemoveContainer" containerID="4b854ea16ad9971fe05cd4e21cc7ff74764043088515d13ac46593e5e6b9d862" Apr 17 14:49:47.459234 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:49:47.459204 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-cjhgj_f9a2b25f-f431-45a4-970d-44f3af0f7ec5/main/5.log" Apr 17 14:49:47.459656 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:49:47.459638 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-cjhgj_f9a2b25f-f431-45a4-970d-44f3af0f7ec5/main/4.log" Apr 17 14:49:47.460010 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:49:47.459980 2582 generic.go:358] "Generic (PLEG): container finished" podID="f9a2b25f-f431-45a4-970d-44f3af0f7ec5" containerID="5ad2dd144826514e70299e36fd8d07dc0be349a920aee5fb6dd6a20525659914" exitCode=2 Apr 17 14:49:47.460093 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:49:47.460039 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-cjhgj" event={"ID":"f9a2b25f-f431-45a4-970d-44f3af0f7ec5","Type":"ContainerDied","Data":"5ad2dd144826514e70299e36fd8d07dc0be349a920aee5fb6dd6a20525659914"} Apr 17 14:49:47.460093 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:49:47.460076 2582 scope.go:117] "RemoveContainer" containerID="4b854ea16ad9971fe05cd4e21cc7ff74764043088515d13ac46593e5e6b9d862" Apr 17 14:49:47.460607 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:49:47.460585 2582 scope.go:117] "RemoveContainer" containerID="5ad2dd144826514e70299e36fd8d07dc0be349a920aee5fb6dd6a20525659914" Apr 17 14:49:47.460912 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:49:47.460879 2582 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-cjhgj_llm(f9a2b25f-f431-45a4-970d-44f3af0f7ec5)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-cjhgj" podUID="f9a2b25f-f431-45a4-970d-44f3af0f7ec5" Apr 17 14:49:48.466854 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:49:48.466792 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-cjhgj_f9a2b25f-f431-45a4-970d-44f3af0f7ec5/main/5.log" Apr 17 14:49:49.744235 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:49:49.744201 2582 scope.go:117] "RemoveContainer" containerID="ec39e091c63f1ca1adb9d5b2ef61f44d3e437830192ebc4505d5d1b4f14b36b9" Apr 17 14:49:49.744612 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:49:49.744394 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5_llm(fe291b55-da84-48e3-9dfe-e7491ab46bdf)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5" podUID="fe291b55-da84-48e3-9dfe-e7491ab46bdf" Apr 17 14:49:54.560667 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:49:54.560628 2582 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-cjhgj" Apr 17 14:49:54.560667 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:49:54.560671 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-cjhgj" Apr 17 14:49:54.561138 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:49:54.561122 2582 scope.go:117] "RemoveContainer" containerID="5ad2dd144826514e70299e36fd8d07dc0be349a920aee5fb6dd6a20525659914" Apr 17 14:49:54.561356 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:49:54.561339 2582 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-cjhgj_llm(f9a2b25f-f431-45a4-970d-44f3af0f7ec5)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-cjhgj" podUID="f9a2b25f-f431-45a4-970d-44f3af0f7ec5" Apr 17 14:49:54.738772 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:49:54.738738 2582 scope.go:117] "RemoveContainer" containerID="ab1e5398c422006b64928b2cffa6dd22f0117fd171ad33517b2f69dd07f077a7" Apr 17 14:49:54.738976 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:49:54.738958 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8_llm(f75b132f-d730-40ea-9dc0-96a3c4ccc878)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8" podUID="f75b132f-d730-40ea-9dc0-96a3c4ccc878" Apr 17 14:49:55.739421 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:49:55.739389 2582 scope.go:117] "RemoveContainer" containerID="2677399c73df729ce033d94540d1d0bf75a52f0acdc28deefcf116ba05f3334f" Apr 17 14:49:56.508184 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:49:56.508152 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-69mx7_d4e084cc-4e34-4e43-a5cc-8c48e232529a/main/5.log" Apr 17 14:49:56.508603 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:49:56.508586 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-69mx7_d4e084cc-4e34-4e43-a5cc-8c48e232529a/main/4.log" Apr 17 14:49:56.508942 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:49:56.508923 2582 generic.go:358] "Generic (PLEG): container finished" podID="d4e084cc-4e34-4e43-a5cc-8c48e232529a" 
containerID="29b3d39c1858ea4b8584e5fc5524dcbae9729efcad49a307a6bc7220cec4a4b5" exitCode=2 Apr 17 14:49:56.509013 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:49:56.508977 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-69mx7" event={"ID":"d4e084cc-4e34-4e43-a5cc-8c48e232529a","Type":"ContainerDied","Data":"29b3d39c1858ea4b8584e5fc5524dcbae9729efcad49a307a6bc7220cec4a4b5"} Apr 17 14:49:56.509013 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:49:56.509010 2582 scope.go:117] "RemoveContainer" containerID="2677399c73df729ce033d94540d1d0bf75a52f0acdc28deefcf116ba05f3334f" Apr 17 14:49:56.509524 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:49:56.509500 2582 scope.go:117] "RemoveContainer" containerID="29b3d39c1858ea4b8584e5fc5524dcbae9729efcad49a307a6bc7220cec4a4b5" Apr 17 14:49:56.509773 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:49:56.509755 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-69mx7_llm(d4e084cc-4e34-4e43-a5cc-8c48e232529a)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-69mx7" podUID="d4e084cc-4e34-4e43-a5cc-8c48e232529a" Apr 17 14:49:56.739417 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:49:56.739388 2582 scope.go:117] "RemoveContainer" containerID="0d397b127d7933cc4119724b54da1485bfc53f05085b9764b8ae7284a9c25367" Apr 17 14:49:56.739591 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:49:56.739555 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w_llm(42ab39a2-9f59-485a-85b3-c767544c515b)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w" podUID="42ab39a2-9f59-485a-85b3-c767544c515b" Apr 17 
14:49:56.752552 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:49:56.752525 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-69mx7" Apr 17 14:49:56.752641 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:49:56.752603 2582 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-69mx7" Apr 17 14:49:57.518326 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:49:57.518302 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-69mx7_d4e084cc-4e34-4e43-a5cc-8c48e232529a/main/5.log" Apr 17 14:49:57.518991 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:49:57.518973 2582 scope.go:117] "RemoveContainer" containerID="29b3d39c1858ea4b8584e5fc5524dcbae9729efcad49a307a6bc7220cec4a4b5" Apr 17 14:49:57.519185 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:49:57.519167 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-69mx7_llm(d4e084cc-4e34-4e43-a5cc-8c48e232529a)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-69mx7" podUID="d4e084cc-4e34-4e43-a5cc-8c48e232529a" Apr 17 14:49:58.523726 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:49:58.523696 2582 scope.go:117] "RemoveContainer" containerID="29b3d39c1858ea4b8584e5fc5524dcbae9729efcad49a307a6bc7220cec4a4b5" Apr 17 14:49:58.524147 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:49:58.523924 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-69mx7_llm(d4e084cc-4e34-4e43-a5cc-8c48e232529a)\"" 
pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-69mx7" podUID="d4e084cc-4e34-4e43-a5cc-8c48e232529a" Apr 17 14:50:00.739235 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:50:00.739204 2582 scope.go:117] "RemoveContainer" containerID="ec39e091c63f1ca1adb9d5b2ef61f44d3e437830192ebc4505d5d1b4f14b36b9" Apr 17 14:50:00.739602 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:50:00.739404 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5_llm(fe291b55-da84-48e3-9dfe-e7491ab46bdf)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5" podUID="fe291b55-da84-48e3-9dfe-e7491ab46bdf" Apr 17 14:50:07.746243 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:50:07.746210 2582 scope.go:117] "RemoveContainer" containerID="ab1e5398c422006b64928b2cffa6dd22f0117fd171ad33517b2f69dd07f077a7" Apr 17 14:50:07.747136 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:50:07.747104 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8_llm(f75b132f-d730-40ea-9dc0-96a3c4ccc878)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8" podUID="f75b132f-d730-40ea-9dc0-96a3c4ccc878" Apr 17 14:50:08.738980 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:50:08.738948 2582 scope.go:117] "RemoveContainer" containerID="5ad2dd144826514e70299e36fd8d07dc0be349a920aee5fb6dd6a20525659914" Apr 17 14:50:08.739185 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:50:08.739165 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main 
pod=e2e-distinct-simulated-kserve-8485d77cdf-cjhgj_llm(f9a2b25f-f431-45a4-970d-44f3af0f7ec5)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-cjhgj" podUID="f9a2b25f-f431-45a4-970d-44f3af0f7ec5" Apr 17 14:50:09.739078 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:50:09.739050 2582 scope.go:117] "RemoveContainer" containerID="29b3d39c1858ea4b8584e5fc5524dcbae9729efcad49a307a6bc7220cec4a4b5" Apr 17 14:50:09.739485 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:50:09.739261 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-69mx7_llm(d4e084cc-4e34-4e43-a5cc-8c48e232529a)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-69mx7" podUID="d4e084cc-4e34-4e43-a5cc-8c48e232529a" Apr 17 14:50:11.739001 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:50:11.738968 2582 scope.go:117] "RemoveContainer" containerID="0d397b127d7933cc4119724b54da1485bfc53f05085b9764b8ae7284a9c25367" Apr 17 14:50:11.739454 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:50:11.739177 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w_llm(42ab39a2-9f59-485a-85b3-c767544c515b)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w" podUID="42ab39a2-9f59-485a-85b3-c767544c515b" Apr 17 14:50:14.738650 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:50:14.738618 2582 scope.go:117] "RemoveContainer" containerID="ec39e091c63f1ca1adb9d5b2ef61f44d3e437830192ebc4505d5d1b4f14b36b9" Apr 17 14:50:14.739059 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:50:14.738861 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed 
container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5_llm(fe291b55-da84-48e3-9dfe-e7491ab46bdf)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5" podUID="fe291b55-da84-48e3-9dfe-e7491ab46bdf" Apr 17 14:50:19.739268 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:50:19.739235 2582 scope.go:117] "RemoveContainer" containerID="5ad2dd144826514e70299e36fd8d07dc0be349a920aee5fb6dd6a20525659914" Apr 17 14:50:19.739657 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:50:19.739449 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-cjhgj_llm(f9a2b25f-f431-45a4-970d-44f3af0f7ec5)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-cjhgj" podUID="f9a2b25f-f431-45a4-970d-44f3af0f7ec5" Apr 17 14:50:22.738858 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:50:22.738796 2582 scope.go:117] "RemoveContainer" containerID="ab1e5398c422006b64928b2cffa6dd22f0117fd171ad33517b2f69dd07f077a7" Apr 17 14:50:22.739235 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:50:22.739017 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8_llm(f75b132f-d730-40ea-9dc0-96a3c4ccc878)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8" podUID="f75b132f-d730-40ea-9dc0-96a3c4ccc878" Apr 17 14:50:24.738715 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:50:24.738635 2582 scope.go:117] "RemoveContainer" containerID="29b3d39c1858ea4b8584e5fc5524dcbae9729efcad49a307a6bc7220cec4a4b5" Apr 17 14:50:24.739126 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:50:24.738846 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting 
failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-69mx7_llm(d4e084cc-4e34-4e43-a5cc-8c48e232529a)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-69mx7" podUID="d4e084cc-4e34-4e43-a5cc-8c48e232529a" Apr 17 14:50:25.738412 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:50:25.738378 2582 scope.go:117] "RemoveContainer" containerID="ec39e091c63f1ca1adb9d5b2ef61f44d3e437830192ebc4505d5d1b4f14b36b9" Apr 17 14:50:25.738627 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:50:25.738505 2582 scope.go:117] "RemoveContainer" containerID="0d397b127d7933cc4119724b54da1485bfc53f05085b9764b8ae7284a9c25367" Apr 17 14:50:25.738705 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:50:25.738625 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5_llm(fe291b55-da84-48e3-9dfe-e7491ab46bdf)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5" podUID="fe291b55-da84-48e3-9dfe-e7491ab46bdf" Apr 17 14:50:25.738705 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:50:25.738670 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w_llm(42ab39a2-9f59-485a-85b3-c767544c515b)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w" podUID="42ab39a2-9f59-485a-85b3-c767544c515b" Apr 17 14:50:32.738690 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:50:32.738657 2582 scope.go:117] "RemoveContainer" containerID="5ad2dd144826514e70299e36fd8d07dc0be349a920aee5fb6dd6a20525659914" Apr 17 14:50:32.739087 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:50:32.738883 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s 
restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-cjhgj_llm(f9a2b25f-f431-45a4-970d-44f3af0f7ec5)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-cjhgj" podUID="f9a2b25f-f431-45a4-970d-44f3af0f7ec5" Apr 17 14:50:35.738354 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:50:35.738313 2582 scope.go:117] "RemoveContainer" containerID="29b3d39c1858ea4b8584e5fc5524dcbae9729efcad49a307a6bc7220cec4a4b5" Apr 17 14:50:35.738836 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:50:35.738591 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-69mx7_llm(d4e084cc-4e34-4e43-a5cc-8c48e232529a)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-69mx7" podUID="d4e084cc-4e34-4e43-a5cc-8c48e232529a" Apr 17 14:50:36.739010 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:50:36.738974 2582 scope.go:117] "RemoveContainer" containerID="0d397b127d7933cc4119724b54da1485bfc53f05085b9764b8ae7284a9c25367" Apr 17 14:50:36.739400 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:50:36.739166 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w_llm(42ab39a2-9f59-485a-85b3-c767544c515b)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w" podUID="42ab39a2-9f59-485a-85b3-c767544c515b" Apr 17 14:50:37.743223 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:50:37.743182 2582 scope.go:117] "RemoveContainer" containerID="ec39e091c63f1ca1adb9d5b2ef61f44d3e437830192ebc4505d5d1b4f14b36b9" Apr 17 14:50:37.743710 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:50:37.743354 2582 scope.go:117] "RemoveContainer" containerID="ab1e5398c422006b64928b2cffa6dd22f0117fd171ad33517b2f69dd07f077a7" Apr 17 14:50:37.743710 
ip-10-0-143-171 kubenswrapper[2582]: E0417 14:50:37.743377 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5_llm(fe291b55-da84-48e3-9dfe-e7491ab46bdf)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5" podUID="fe291b55-da84-48e3-9dfe-e7491ab46bdf" Apr 17 14:50:37.743710 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:50:37.743556 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8_llm(f75b132f-d730-40ea-9dc0-96a3c4ccc878)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8" podUID="f75b132f-d730-40ea-9dc0-96a3c4ccc878" Apr 17 14:50:44.738553 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:50:44.738521 2582 scope.go:117] "RemoveContainer" containerID="5ad2dd144826514e70299e36fd8d07dc0be349a920aee5fb6dd6a20525659914" Apr 17 14:50:44.739222 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:50:44.738767 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-cjhgj_llm(f9a2b25f-f431-45a4-970d-44f3af0f7ec5)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-cjhgj" podUID="f9a2b25f-f431-45a4-970d-44f3af0f7ec5" Apr 17 14:50:46.739080 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:50:46.739043 2582 scope.go:117] "RemoveContainer" containerID="29b3d39c1858ea4b8584e5fc5524dcbae9729efcad49a307a6bc7220cec4a4b5" Apr 17 14:50:46.739473 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:50:46.739245 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: 
\"back-off 2m40s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-69mx7_llm(d4e084cc-4e34-4e43-a5cc-8c48e232529a)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-69mx7" podUID="d4e084cc-4e34-4e43-a5cc-8c48e232529a" Apr 17 14:50:48.738903 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:50:48.738869 2582 scope.go:117] "RemoveContainer" containerID="0d397b127d7933cc4119724b54da1485bfc53f05085b9764b8ae7284a9c25367" Apr 17 14:50:48.739341 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:50:48.739055 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w_llm(42ab39a2-9f59-485a-85b3-c767544c515b)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w" podUID="42ab39a2-9f59-485a-85b3-c767544c515b" Apr 17 14:50:48.739341 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:50:48.739095 2582 scope.go:117] "RemoveContainer" containerID="ab1e5398c422006b64928b2cffa6dd22f0117fd171ad33517b2f69dd07f077a7" Apr 17 14:50:48.739341 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:50:48.739228 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8_llm(f75b132f-d730-40ea-9dc0-96a3c4ccc878)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8" podUID="f75b132f-d730-40ea-9dc0-96a3c4ccc878" Apr 17 14:50:49.236398 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:50:49.236357 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w_42ab39a2-9f59-485a-85b3-c767544c515b/main/5.log" Apr 17 14:50:49.348302 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:50:49.348261 2582 log.go:25] "Finished parsing log 
file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w_42ab39a2-9f59-485a-85b3-c767544c515b/main/5.log" Apr 17 14:50:49.464646 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:50:49.464619 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w_42ab39a2-9f59-485a-85b3-c767544c515b/storage-initializer/0.log" Apr 17 14:50:50.738904 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:50:50.738874 2582 scope.go:117] "RemoveContainer" containerID="ec39e091c63f1ca1adb9d5b2ef61f44d3e437830192ebc4505d5d1b4f14b36b9" Apr 17 14:50:50.739284 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:50:50.739043 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5_llm(fe291b55-da84-48e3-9dfe-e7491ab46bdf)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5" podUID="fe291b55-da84-48e3-9dfe-e7491ab46bdf" Apr 17 14:50:54.698033 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:50:54.697993 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-6c585549bc-vpjwk_f87942f0-7ec2-4f54-a6a5-1b433d74a993/manager/0.log" Apr 17 14:50:55.680558 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:50:55.680526 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759lwhk8_897b8f66-fa5f-4f88-8b95-fbe02babf7ba/util/0.log" Apr 17 14:50:55.690622 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:50:55.690596 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759lwhk8_897b8f66-fa5f-4f88-8b95-fbe02babf7ba/pull/0.log" Apr 17 14:50:55.702330 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:50:55.702307 2582 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759lwhk8_897b8f66-fa5f-4f88-8b95-fbe02babf7ba/extract/0.log" Apr 17 14:50:55.738730 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:50:55.738699 2582 scope.go:117] "RemoveContainer" containerID="5ad2dd144826514e70299e36fd8d07dc0be349a920aee5fb6dd6a20525659914" Apr 17 14:50:55.738939 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:50:55.738915 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-cjhgj_llm(f9a2b25f-f431-45a4-970d-44f3af0f7ec5)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-cjhgj" podUID="f9a2b25f-f431-45a4-970d-44f3af0f7ec5" Apr 17 14:50:55.815447 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:50:55.815411 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0p8czd_1768a071-915d-4020-99b0-ed685dda3a5c/util/0.log" Apr 17 14:50:55.824750 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:50:55.824730 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0p8czd_1768a071-915d-4020-99b0-ed685dda3a5c/pull/0.log" Apr 17 14:50:55.833058 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:50:55.833036 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0p8czd_1768a071-915d-4020-99b0-ed685dda3a5c/extract/0.log" Apr 17 14:50:55.945499 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:50:55.945414 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed737dp4h_07774c38-aa20-409e-bfa2-e7a68c224bf6/util/0.log" Apr 17 14:50:55.955445 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:50:55.955418 2582 
log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed737dp4h_07774c38-aa20-409e-bfa2-e7a68c224bf6/pull/0.log" Apr 17 14:50:55.965332 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:50:55.965311 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed737dp4h_07774c38-aa20-409e-bfa2-e7a68c224bf6/extract/0.log" Apr 17 14:50:56.077316 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:50:56.077270 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1j4m4t_243de439-5be8-46ad-a8ce-3b20ee27fdb6/util/0.log" Apr 17 14:50:56.084679 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:50:56.084655 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1j4m4t_243de439-5be8-46ad-a8ce-3b20ee27fdb6/pull/0.log" Apr 17 14:50:56.092330 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:50:56.092307 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1j4m4t_243de439-5be8-46ad-a8ce-3b20ee27fdb6/extract/0.log" Apr 17 14:50:56.440079 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:50:56.440045 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-7kw9m_43e321c6-e135-4b2b-a15d-87bc335ca9ad/manager/0.log" Apr 17 14:50:56.660054 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:50:56.660020 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-lgvvl_85d9d156-7cd5-4964-bafa-90b8956c99e2/registry-server/0.log" Apr 17 14:50:56.780784 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:50:56.780754 2582 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-6bc9f4c76f-bzjv5_3106d538-3643-42c5-92cb-dbea4af3a682/manager/0.log" Apr 17 14:50:56.992989 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:50:56.992957 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-xk7f2_84c2a07d-f1ec-4180-ac4b-baf6c4323db5/manager/0.log" Apr 17 14:50:57.313506 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:50:57.313472 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-5cb8b776cfjsvgh_22578833-852e-415f-929c-8d9010f87cee/istio-proxy/0.log" Apr 17 14:50:58.188121 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:50:58.188091 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5_fe291b55-da84-48e3-9dfe-e7491ab46bdf/storage-initializer/0.log" Apr 17 14:50:58.194265 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:50:58.194239 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5_fe291b55-da84-48e3-9dfe-e7491ab46bdf/main/5.log" Apr 17 14:50:58.306419 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:50:58.306387 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-cjhgj_f9a2b25f-f431-45a4-970d-44f3af0f7ec5/storage-initializer/0.log" Apr 17 14:50:58.313531 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:50:58.313489 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-8485d77cdf-cjhgj_f9a2b25f-f431-45a4-970d-44f3af0f7ec5/main/5.log" Apr 17 14:50:58.420285 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:50:58.420253 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-trlp-test-simulated-kserve-6d5965695-zhl2l_86f00b01-b2bf-4097-8c2e-85946b07e499/main/0.log" Apr 17 14:50:58.427592 ip-10-0-143-171 
kubenswrapper[2582]: I0417 14:50:58.427565 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-trlp-test-simulated-kserve-6d5965695-zhl2l_86f00b01-b2bf-4097-8c2e-85946b07e499/storage-initializer/0.log" Apr 17 14:50:58.532042 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:50:58.532011 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8_f75b132f-d730-40ea-9dc0-96a3c4ccc878/storage-initializer/0.log" Apr 17 14:50:58.539840 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:50:58.539792 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8_f75b132f-d730-40ea-9dc0-96a3c4ccc878/main/5.log" Apr 17 14:50:58.649680 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:50:58.649636 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w_42ab39a2-9f59-485a-85b3-c767544c515b/storage-initializer/0.log" Apr 17 14:50:58.655968 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:50:58.655948 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w_42ab39a2-9f59-485a-85b3-c767544c515b/main/5.log" Apr 17 14:50:58.739323 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:50:58.739283 2582 scope.go:117] "RemoveContainer" containerID="29b3d39c1858ea4b8584e5fc5524dcbae9729efcad49a307a6bc7220cec4a4b5" Apr 17 14:50:58.739577 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:50:58.739554 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-69mx7_llm(d4e084cc-4e34-4e43-a5cc-8c48e232529a)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-69mx7" podUID="d4e084cc-4e34-4e43-a5cc-8c48e232529a" Apr 17 14:50:58.763638 ip-10-0-143-171 
kubenswrapper[2582]: I0417 14:50:58.763609 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-69mx7_d4e084cc-4e34-4e43-a5cc-8c48e232529a/storage-initializer/0.log" Apr 17 14:50:58.769676 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:50:58.769653 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-69mx7_d4e084cc-4e34-4e43-a5cc-8c48e232529a/main/5.log" Apr 17 14:51:00.738941 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:00.738903 2582 scope.go:117] "RemoveContainer" containerID="ab1e5398c422006b64928b2cffa6dd22f0117fd171ad33517b2f69dd07f077a7" Apr 17 14:51:00.739413 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:51:00.739184 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8_llm(f75b132f-d730-40ea-9dc0-96a3c4ccc878)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8" podUID="f75b132f-d730-40ea-9dc0-96a3c4ccc878" Apr 17 14:51:01.739391 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:01.739353 2582 scope.go:117] "RemoveContainer" containerID="0d397b127d7933cc4119724b54da1485bfc53f05085b9764b8ae7284a9c25367" Apr 17 14:51:01.739896 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:51:01.739542 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w_llm(42ab39a2-9f59-485a-85b3-c767544c515b)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w" podUID="42ab39a2-9f59-485a-85b3-c767544c515b" Apr 17 14:51:05.422391 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:05.422361 2582 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kube-system_global-pull-secret-syncer-z82rb_1c999004-4a51-40aa-b675-8213daef0914/global-pull-secret-syncer/0.log" Apr 17 14:51:05.517108 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:05.517080 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-lvtf4_62d02888-cea1-4f15-b042-fb651835bf6a/konnectivity-agent/0.log" Apr 17 14:51:05.580536 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:05.580507 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-143-171.ec2.internal_342031fe6556563fbef6e8d55c3a781f/haproxy/0.log" Apr 17 14:51:05.739414 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:05.739330 2582 scope.go:117] "RemoveContainer" containerID="ec39e091c63f1ca1adb9d5b2ef61f44d3e437830192ebc4505d5d1b4f14b36b9" Apr 17 14:51:05.739631 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:51:05.739607 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5_llm(fe291b55-da84-48e3-9dfe-e7491ab46bdf)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5" podUID="fe291b55-da84-48e3-9dfe-e7491ab46bdf" Apr 17 14:51:09.164921 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:09.164886 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759lwhk8_897b8f66-fa5f-4f88-8b95-fbe02babf7ba/extract/0.log" Apr 17 14:51:09.189274 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:09.189246 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759lwhk8_897b8f66-fa5f-4f88-8b95-fbe02babf7ba/util/0.log" Apr 17 14:51:09.215289 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:09.215231 2582 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759lwhk8_897b8f66-fa5f-4f88-8b95-fbe02babf7ba/pull/0.log" Apr 17 14:51:09.254887 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:09.254855 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0p8czd_1768a071-915d-4020-99b0-ed685dda3a5c/extract/0.log" Apr 17 14:51:09.274953 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:09.274924 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0p8czd_1768a071-915d-4020-99b0-ed685dda3a5c/util/0.log" Apr 17 14:51:09.297478 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:09.297448 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0p8czd_1768a071-915d-4020-99b0-ed685dda3a5c/pull/0.log" Apr 17 14:51:09.328491 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:09.328466 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed737dp4h_07774c38-aa20-409e-bfa2-e7a68c224bf6/extract/0.log" Apr 17 14:51:09.348776 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:09.348753 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed737dp4h_07774c38-aa20-409e-bfa2-e7a68c224bf6/util/0.log" Apr 17 14:51:09.371734 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:09.371705 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed737dp4h_07774c38-aa20-409e-bfa2-e7a68c224bf6/pull/0.log" Apr 17 14:51:09.409505 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:09.409475 2582 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1j4m4t_243de439-5be8-46ad-a8ce-3b20ee27fdb6/extract/0.log" Apr 17 14:51:09.435403 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:09.435337 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1j4m4t_243de439-5be8-46ad-a8ce-3b20ee27fdb6/util/0.log" Apr 17 14:51:09.458183 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:09.458154 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1j4m4t_243de439-5be8-46ad-a8ce-3b20ee27fdb6/pull/0.log" Apr 17 14:51:09.544255 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:09.544230 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-7kw9m_43e321c6-e135-4b2b-a15d-87bc335ca9ad/manager/0.log" Apr 17 14:51:09.597921 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:09.597883 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-lgvvl_85d9d156-7cd5-4964-bafa-90b8956c99e2/registry-server/0.log" Apr 17 14:51:09.642686 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:09.642653 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-6bc9f4c76f-bzjv5_3106d538-3643-42c5-92cb-dbea4af3a682/manager/0.log" Apr 17 14:51:09.697632 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:09.697502 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-xk7f2_84c2a07d-f1ec-4180-ac4b-baf6c4323db5/manager/0.log" Apr 17 14:51:09.739516 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:09.739478 2582 scope.go:117] "RemoveContainer" containerID="5ad2dd144826514e70299e36fd8d07dc0be349a920aee5fb6dd6a20525659914" Apr 17 14:51:09.739709 ip-10-0-143-171 
kubenswrapper[2582]: E0417 14:51:09.739634 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-cjhgj_llm(f9a2b25f-f431-45a4-970d-44f3af0f7ec5)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-cjhgj" podUID="f9a2b25f-f431-45a4-970d-44f3af0f7ec5" Apr 17 14:51:11.128233 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:11.128203 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-4mfpc_1ccbfbc4-5994-4305-ad57-c1b4c21d6b7e/cluster-monitoring-operator/0.log" Apr 17 14:51:11.398245 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:11.398151 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-h9wzt_ab6693d0-6bf3-4a27-be94-c542f254b76b/node-exporter/0.log" Apr 17 14:51:11.419090 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:11.419061 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-h9wzt_ab6693d0-6bf3-4a27-be94-c542f254b76b/kube-rbac-proxy/0.log" Apr 17 14:51:11.439546 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:11.439512 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-h9wzt_ab6693d0-6bf3-4a27-be94-c542f254b76b/init-textfile/0.log" Apr 17 14:51:11.738567 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:11.738484 2582 scope.go:117] "RemoveContainer" containerID="ab1e5398c422006b64928b2cffa6dd22f0117fd171ad33517b2f69dd07f077a7" Apr 17 14:51:11.738716 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:51:11.738686 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main 
pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8_llm(f75b132f-d730-40ea-9dc0-96a3c4ccc878)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8" podUID="f75b132f-d730-40ea-9dc0-96a3c4ccc878" Apr 17 14:51:11.784441 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:11.784408 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-bzgd8_5881f6c9-b857-44e4-b059-69f64682df56/prometheus-operator/0.log" Apr 17 14:51:11.804496 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:11.804463 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-bzgd8_5881f6c9-b857-44e4-b059-69f64682df56/kube-rbac-proxy/0.log" Apr 17 14:51:11.831582 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:11.831553 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-lsckd_9d8aa6c7-c009-4909-982d-1c27652a9903/prometheus-operator-admission-webhook/0.log" Apr 17 14:51:11.944168 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:11.944135 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5fc8496d4f-vbjg9_1f793873-40a8-4eec-908a-2d5e2bdf7aa9/thanos-query/0.log" Apr 17 14:51:11.965230 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:11.965161 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5fc8496d4f-vbjg9_1f793873-40a8-4eec-908a-2d5e2bdf7aa9/kube-rbac-proxy-web/0.log" Apr 17 14:51:11.986207 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:11.986153 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5fc8496d4f-vbjg9_1f793873-40a8-4eec-908a-2d5e2bdf7aa9/kube-rbac-proxy/0.log" Apr 17 14:51:12.007163 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:12.007130 2582 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_thanos-querier-5fc8496d4f-vbjg9_1f793873-40a8-4eec-908a-2d5e2bdf7aa9/prom-label-proxy/0.log" Apr 17 14:51:12.029238 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:12.029205 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5fc8496d4f-vbjg9_1f793873-40a8-4eec-908a-2d5e2bdf7aa9/kube-rbac-proxy-rules/0.log" Apr 17 14:51:12.056661 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:12.056632 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5fc8496d4f-vbjg9_1f793873-40a8-4eec-908a-2d5e2bdf7aa9/kube-rbac-proxy-metrics/0.log" Apr 17 14:51:13.691874 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:13.691838 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-bss6z_8cf1a0e0-95fc-4bce-84ed-e3222e5beeb2/console-operator/2.log" Apr 17 14:51:13.696123 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:13.696100 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-bss6z_8cf1a0e0-95fc-4bce-84ed-e3222e5beeb2/console-operator/3.log" Apr 17 14:51:13.739303 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:13.739267 2582 scope.go:117] "RemoveContainer" containerID="29b3d39c1858ea4b8584e5fc5524dcbae9729efcad49a307a6bc7220cec4a4b5" Apr 17 14:51:13.739506 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:13.739390 2582 scope.go:117] "RemoveContainer" containerID="0d397b127d7933cc4119724b54da1485bfc53f05085b9764b8ae7284a9c25367" Apr 17 14:51:13.739575 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:51:13.739526 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-69mx7_llm(d4e084cc-4e34-4e43-a5cc-8c48e232529a)\"" 
pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-69mx7" podUID="d4e084cc-4e34-4e43-a5cc-8c48e232529a" Apr 17 14:51:13.739575 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:51:13.739540 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w_llm(42ab39a2-9f59-485a-85b3-c767544c515b)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w" podUID="42ab39a2-9f59-485a-85b3-c767544c515b" Apr 17 14:51:13.987494 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:13.987398 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-fvwkz/perf-node-gather-daemonset-mg4qw"] Apr 17 14:51:13.991432 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:13.991405 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-fvwkz/perf-node-gather-daemonset-mg4qw" Apr 17 14:51:13.994199 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:13.994166 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-fvwkz\"/\"kube-root-ca.crt\"" Apr 17 14:51:13.995082 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:13.995061 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-fvwkz\"/\"openshift-service-ca.crt\"" Apr 17 14:51:13.995173 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:13.995068 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-fvwkz\"/\"default-dockercfg-p84cm\"" Apr 17 14:51:14.001422 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:14.001389 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-fvwkz/perf-node-gather-daemonset-mg4qw"] Apr 17 14:51:14.080678 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:14.080635 2582 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/188a0b92-3e02-4983-98ee-78c58884a90a-podres\") pod \"perf-node-gather-daemonset-mg4qw\" (UID: \"188a0b92-3e02-4983-98ee-78c58884a90a\") " pod="openshift-must-gather-fvwkz/perf-node-gather-daemonset-mg4qw" Apr 17 14:51:14.080678 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:14.080681 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/188a0b92-3e02-4983-98ee-78c58884a90a-sys\") pod \"perf-node-gather-daemonset-mg4qw\" (UID: \"188a0b92-3e02-4983-98ee-78c58884a90a\") " pod="openshift-must-gather-fvwkz/perf-node-gather-daemonset-mg4qw" Apr 17 14:51:14.080981 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:14.080867 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/188a0b92-3e02-4983-98ee-78c58884a90a-lib-modules\") pod \"perf-node-gather-daemonset-mg4qw\" (UID: \"188a0b92-3e02-4983-98ee-78c58884a90a\") " pod="openshift-must-gather-fvwkz/perf-node-gather-daemonset-mg4qw" Apr 17 14:51:14.080981 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:14.080931 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/188a0b92-3e02-4983-98ee-78c58884a90a-proc\") pod \"perf-node-gather-daemonset-mg4qw\" (UID: \"188a0b92-3e02-4983-98ee-78c58884a90a\") " pod="openshift-must-gather-fvwkz/perf-node-gather-daemonset-mg4qw" Apr 17 14:51:14.081062 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:14.081019 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ms9r\" (UniqueName: \"kubernetes.io/projected/188a0b92-3e02-4983-98ee-78c58884a90a-kube-api-access-9ms9r\") pod \"perf-node-gather-daemonset-mg4qw\" (UID: 
\"188a0b92-3e02-4983-98ee-78c58884a90a\") " pod="openshift-must-gather-fvwkz/perf-node-gather-daemonset-mg4qw" Apr 17 14:51:14.152476 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:14.152444 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7648f79794-9hjrz_2a3e1b8e-c193-404a-8b22-503df0af5366/console/0.log" Apr 17 14:51:14.179834 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:14.179772 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-rgv47_5ed0f5e3-c06f-4fe3-9938-13df41a47562/download-server/0.log" Apr 17 14:51:14.181503 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:14.181474 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/188a0b92-3e02-4983-98ee-78c58884a90a-proc\") pod \"perf-node-gather-daemonset-mg4qw\" (UID: \"188a0b92-3e02-4983-98ee-78c58884a90a\") " pod="openshift-must-gather-fvwkz/perf-node-gather-daemonset-mg4qw" Apr 17 14:51:14.181641 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:14.181532 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9ms9r\" (UniqueName: \"kubernetes.io/projected/188a0b92-3e02-4983-98ee-78c58884a90a-kube-api-access-9ms9r\") pod \"perf-node-gather-daemonset-mg4qw\" (UID: \"188a0b92-3e02-4983-98ee-78c58884a90a\") " pod="openshift-must-gather-fvwkz/perf-node-gather-daemonset-mg4qw" Apr 17 14:51:14.181641 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:14.181583 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/188a0b92-3e02-4983-98ee-78c58884a90a-podres\") pod \"perf-node-gather-daemonset-mg4qw\" (UID: \"188a0b92-3e02-4983-98ee-78c58884a90a\") " pod="openshift-must-gather-fvwkz/perf-node-gather-daemonset-mg4qw" Apr 17 14:51:14.181641 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:14.181600 2582 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/188a0b92-3e02-4983-98ee-78c58884a90a-proc\") pod \"perf-node-gather-daemonset-mg4qw\" (UID: \"188a0b92-3e02-4983-98ee-78c58884a90a\") " pod="openshift-must-gather-fvwkz/perf-node-gather-daemonset-mg4qw" Apr 17 14:51:14.181641 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:14.181610 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/188a0b92-3e02-4983-98ee-78c58884a90a-sys\") pod \"perf-node-gather-daemonset-mg4qw\" (UID: \"188a0b92-3e02-4983-98ee-78c58884a90a\") " pod="openshift-must-gather-fvwkz/perf-node-gather-daemonset-mg4qw" Apr 17 14:51:14.181852 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:14.181673 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/188a0b92-3e02-4983-98ee-78c58884a90a-sys\") pod \"perf-node-gather-daemonset-mg4qw\" (UID: \"188a0b92-3e02-4983-98ee-78c58884a90a\") " pod="openshift-must-gather-fvwkz/perf-node-gather-daemonset-mg4qw" Apr 17 14:51:14.181852 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:14.181719 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/188a0b92-3e02-4983-98ee-78c58884a90a-podres\") pod \"perf-node-gather-daemonset-mg4qw\" (UID: \"188a0b92-3e02-4983-98ee-78c58884a90a\") " pod="openshift-must-gather-fvwkz/perf-node-gather-daemonset-mg4qw" Apr 17 14:51:14.181852 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:14.181756 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/188a0b92-3e02-4983-98ee-78c58884a90a-lib-modules\") pod \"perf-node-gather-daemonset-mg4qw\" (UID: \"188a0b92-3e02-4983-98ee-78c58884a90a\") " pod="openshift-must-gather-fvwkz/perf-node-gather-daemonset-mg4qw" Apr 17 14:51:14.181954 ip-10-0-143-171 
kubenswrapper[2582]: I0417 14:51:14.181916 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/188a0b92-3e02-4983-98ee-78c58884a90a-lib-modules\") pod \"perf-node-gather-daemonset-mg4qw\" (UID: \"188a0b92-3e02-4983-98ee-78c58884a90a\") " pod="openshift-must-gather-fvwkz/perf-node-gather-daemonset-mg4qw" Apr 17 14:51:14.189885 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:14.189849 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ms9r\" (UniqueName: \"kubernetes.io/projected/188a0b92-3e02-4983-98ee-78c58884a90a-kube-api-access-9ms9r\") pod \"perf-node-gather-daemonset-mg4qw\" (UID: \"188a0b92-3e02-4983-98ee-78c58884a90a\") " pod="openshift-must-gather-fvwkz/perf-node-gather-daemonset-mg4qw" Apr 17 14:51:14.303367 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:14.303329 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-fvwkz/perf-node-gather-daemonset-mg4qw" Apr 17 14:51:14.437772 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:14.437733 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-fvwkz/perf-node-gather-daemonset-mg4qw"] Apr 17 14:51:14.439041 ip-10-0-143-171 kubenswrapper[2582]: W0417 14:51:14.439009 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod188a0b92_3e02_4983_98ee_78c58884a90a.slice/crio-08b1e227448a2a283c825767b4825ec45022de8cac767c1322672995a2cf0947 WatchSource:0}: Error finding container 08b1e227448a2a283c825767b4825ec45022de8cac767c1322672995a2cf0947: Status 404 returned error can't find the container with id 08b1e227448a2a283c825767b4825ec45022de8cac767c1322672995a2cf0947 Apr 17 14:51:14.440896 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:14.440877 2582 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 14:51:14.682254 ip-10-0-143-171 
kubenswrapper[2582]: I0417 14:51:14.682227 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-n2jzl_6bab6f40-808e-4200-a47c-8888edae71b6/volume-data-source-validator/0.log" Apr 17 14:51:14.859037 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:14.858941 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fvwkz/perf-node-gather-daemonset-mg4qw" event={"ID":"188a0b92-3e02-4983-98ee-78c58884a90a","Type":"ContainerStarted","Data":"3e92152a285bb87649d7b64eabc05ae620f8aae63e225107fadcbfdfeb460369"} Apr 17 14:51:14.859037 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:14.858984 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fvwkz/perf-node-gather-daemonset-mg4qw" event={"ID":"188a0b92-3e02-4983-98ee-78c58884a90a","Type":"ContainerStarted","Data":"08b1e227448a2a283c825767b4825ec45022de8cac767c1322672995a2cf0947"} Apr 17 14:51:14.859037 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:14.859008 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-fvwkz/perf-node-gather-daemonset-mg4qw" Apr 17 14:51:14.877259 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:14.877199 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-fvwkz/perf-node-gather-daemonset-mg4qw" podStartSLOduration=1.877180426 podStartE2EDuration="1.877180426s" podCreationTimestamp="2026-04-17 14:51:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 14:51:14.875610643 +0000 UTC m=+1027.742996295" watchObservedRunningTime="2026-04-17 14:51:14.877180426 +0000 UTC m=+1027.744566076" Apr 17 14:51:15.521004 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:15.520973 2582 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-dns_dns-default-kj9w5_fe006aa3-2754-4ceb-acc9-c8189d25053b/dns/0.log" Apr 17 14:51:15.543147 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:15.543115 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-kj9w5_fe006aa3-2754-4ceb-acc9-c8189d25053b/kube-rbac-proxy/0.log" Apr 17 14:51:15.650463 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:15.650437 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-m5qlh_542f49ba-8bb4-4178-9c98-a94bc1f60de1/dns-node-resolver/0.log" Apr 17 14:51:16.174854 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:16.174825 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-rhww7_82ff5ce9-528b-4d19-9a09-ac7e64ef9d46/node-ca/0.log" Apr 17 14:51:17.081556 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:17.081517 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-5cb8b776cfjsvgh_22578833-852e-415f-929c-8d9010f87cee/istio-proxy/0.log" Apr 17 14:51:17.709131 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:17.709106 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-87v7h_76590649-d620-489b-9a3c-5c78ec32d35e/serve-healthcheck-canary/0.log" Apr 17 14:51:17.742174 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:17.742140 2582 scope.go:117] "RemoveContainer" containerID="ec39e091c63f1ca1adb9d5b2ef61f44d3e437830192ebc4505d5d1b4f14b36b9" Apr 17 14:51:17.742417 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:51:17.742395 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5_llm(fe291b55-da84-48e3-9dfe-e7491ab46bdf)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5" 
podUID="fe291b55-da84-48e3-9dfe-e7491ab46bdf" Apr 17 14:51:18.169625 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:18.169592 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-9kpqq_c73897be-24cb-49ee-a735-e2c35eb461f4/insights-operator/1.log" Apr 17 14:51:18.169996 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:18.169976 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-9kpqq_c73897be-24cb-49ee-a735-e2c35eb461f4/insights-operator/0.log" Apr 17 14:51:18.258266 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:18.258239 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-kqx2d_d0785d07-4d16-482b-a416-61f38c7665f3/kube-rbac-proxy/0.log" Apr 17 14:51:18.284495 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:18.284459 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-kqx2d_d0785d07-4d16-482b-a416-61f38c7665f3/exporter/0.log" Apr 17 14:51:18.306250 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:18.306216 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-kqx2d_d0785d07-4d16-482b-a416-61f38c7665f3/extractor/0.log" Apr 17 14:51:20.339255 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:20.339220 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-6c585549bc-vpjwk_f87942f0-7ec2-4f54-a6a5-1b433d74a993/manager/0.log" Apr 17 14:51:20.873852 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:20.873827 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-fvwkz/perf-node-gather-daemonset-mg4qw" Apr 17 14:51:21.455404 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:21.455369 2582 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-lws-operator_lws-controller-manager-66f567c4b6-b4tb6_a9b3f507-f98e-4518-bd22-ab9ce481958d/manager/0.log" Apr 17 14:51:21.500927 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:21.500899 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_openshift-lws-operator-bfc7f696d-c46sl_1e2a754c-4d22-4c25-b310-d76568637da3/openshift-lws-operator/0.log" Apr 17 14:51:22.739237 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:22.739148 2582 scope.go:117] "RemoveContainer" containerID="ab1e5398c422006b64928b2cffa6dd22f0117fd171ad33517b2f69dd07f077a7" Apr 17 14:51:22.739596 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:22.739299 2582 scope.go:117] "RemoveContainer" containerID="5ad2dd144826514e70299e36fd8d07dc0be349a920aee5fb6dd6a20525659914" Apr 17 14:51:22.739596 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:51:22.739412 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8_llm(f75b132f-d730-40ea-9dc0-96a3c4ccc878)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8k8lr8" podUID="f75b132f-d730-40ea-9dc0-96a3c4ccc878" Apr 17 14:51:22.739596 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:51:22.739463 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-simulated-kserve-8485d77cdf-cjhgj_llm(f9a2b25f-f431-45a4-970d-44f3af0f7ec5)\"" pod="llm/e2e-distinct-simulated-kserve-8485d77cdf-cjhgj" podUID="f9a2b25f-f431-45a4-970d-44f3af0f7ec5" Apr 17 14:51:25.852195 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:25.852165 2582 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-kfqfw_c3698900-25ed-4e05-b420-0a5a402ecaac/migrator/0.log"
Apr 17 14:51:25.872551 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:25.872521 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-kfqfw_c3698900-25ed-4e05-b420-0a5a402ecaac/graceful-termination/0.log"
Apr 17 14:51:26.194185 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:26.194096 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-qq9b8_84ae5f81-3a81-49e9-a3fb-cad167d0281b/kube-storage-version-migrator-operator/1.log"
Apr 17 14:51:26.195136 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:26.195113 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-qq9b8_84ae5f81-3a81-49e9-a3fb-cad167d0281b/kube-storage-version-migrator-operator/0.log"
Apr 17 14:51:26.739095 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:26.739061 2582 scope.go:117] "RemoveContainer" containerID="29b3d39c1858ea4b8584e5fc5524dcbae9729efcad49a307a6bc7220cec4a4b5"
Apr 17 14:51:26.739344 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:51:26.739326 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-69mx7_llm(d4e084cc-4e34-4e43-a5cc-8c48e232529a)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-69mx7" podUID="d4e084cc-4e34-4e43-a5cc-8c48e232529a"
Apr 17 14:51:27.141641 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:27.141611 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-4zcj9_7f84353d-e913-4a0e-94b9-1138b03b1814/kube-multus/0.log"
Apr 17 14:51:27.320699 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:27.320667 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5kzp2_175f6a59-d17b-42f0-b454-ff9a315c3d7a/kube-multus-additional-cni-plugins/0.log"
Apr 17 14:51:27.342781 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:27.342752 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5kzp2_175f6a59-d17b-42f0-b454-ff9a315c3d7a/egress-router-binary-copy/0.log"
Apr 17 14:51:27.363673 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:27.363629 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5kzp2_175f6a59-d17b-42f0-b454-ff9a315c3d7a/cni-plugins/0.log"
Apr 17 14:51:27.383904 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:27.383872 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5kzp2_175f6a59-d17b-42f0-b454-ff9a315c3d7a/bond-cni-plugin/0.log"
Apr 17 14:51:27.410295 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:27.410212 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5kzp2_175f6a59-d17b-42f0-b454-ff9a315c3d7a/routeoverride-cni/0.log"
Apr 17 14:51:27.431321 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:27.431292 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5kzp2_175f6a59-d17b-42f0-b454-ff9a315c3d7a/whereabouts-cni-bincopy/0.log"
Apr 17 14:51:27.452778 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:27.452751 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5kzp2_175f6a59-d17b-42f0-b454-ff9a315c3d7a/whereabouts-cni/0.log"
Apr 17 14:51:27.688884 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:27.688778 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-4kdjq_fbcf40f6-2ec0-4fb3-85d8-30ecb284384d/network-metrics-daemon/0.log"
Apr 17 14:51:27.708712 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:27.708677 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-4kdjq_fbcf40f6-2ec0-4fb3-85d8-30ecb284384d/kube-rbac-proxy/0.log"
Apr 17 14:51:28.574702 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:28.574653 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f5brp_f219cfa3-6451-4834-b849-5d264acf8bac/ovn-controller/0.log"
Apr 17 14:51:28.594927 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:28.594898 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f5brp_f219cfa3-6451-4834-b849-5d264acf8bac/ovn-acl-logging/0.log"
Apr 17 14:51:28.612768 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:28.612741 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f5brp_f219cfa3-6451-4834-b849-5d264acf8bac/kube-rbac-proxy-node/0.log"
Apr 17 14:51:28.632212 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:28.632186 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f5brp_f219cfa3-6451-4834-b849-5d264acf8bac/kube-rbac-proxy-ovn-metrics/0.log"
Apr 17 14:51:28.650141 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:28.650111 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f5brp_f219cfa3-6451-4834-b849-5d264acf8bac/northd/0.log"
Apr 17 14:51:28.669829 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:28.669779 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f5brp_f219cfa3-6451-4834-b849-5d264acf8bac/nbdb/0.log"
Apr 17 14:51:28.690560 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:28.690539 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f5brp_f219cfa3-6451-4834-b849-5d264acf8bac/sbdb/0.log"
Apr 17 14:51:28.738359 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:28.738324 2582 scope.go:117] "RemoveContainer" containerID="ec39e091c63f1ca1adb9d5b2ef61f44d3e437830192ebc4505d5d1b4f14b36b9"
Apr 17 14:51:28.738548 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:28.738434 2582 scope.go:117] "RemoveContainer" containerID="0d397b127d7933cc4119724b54da1485bfc53f05085b9764b8ae7284a9c25367"
Apr 17 14:51:28.738548 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:51:28.738540 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5_llm(fe291b55-da84-48e3-9dfe-e7491ab46bdf)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-s69n5" podUID="fe291b55-da84-48e3-9dfe-e7491ab46bdf"
Apr 17 14:51:28.738662 ip-10-0-143-171 kubenswrapper[2582]: E0417 14:51:28.738583 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w_llm(42ab39a2-9f59-485a-85b3-c767544c515b)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-xpp6w" podUID="42ab39a2-9f59-485a-85b3-c767544c515b"
Apr 17 14:51:28.791140 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:28.791101 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f5brp_f219cfa3-6451-4834-b849-5d264acf8bac/ovnkube-controller/0.log"
Apr 17 14:51:30.378878 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:30.378853 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-rl7lx_93ac31c3-23e6-4891-a746-c22dabdbc864/check-endpoints/0.log"
Apr 17 14:51:30.423645 ip-10-0-143-171 kubenswrapper[2582]: I0417 14:51:30.423608 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-jfnzx_efde8bcb-629a-4cd7-9fe4-cea71e67b06e/network-check-target-container/0.log"