Apr 17 16:28:39.188693 ip-10-0-131-177 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 17 16:28:39.188711 ip-10-0-131-177 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 17 16:28:39.188721 ip-10-0-131-177 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 17 16:28:39.189048 ip-10-0-131-177 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 17 16:28:49.427356 ip-10-0-131-177 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 17 16:28:49.427371 ip-10-0-131-177 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot e93eab51c8ce4fc694e498bbd61a6862 --
Apr 17 16:31:17.305017 ip-10-0-131-177 systemd[1]: Starting Kubernetes Kubelet...
Apr 17 16:31:17.706346 ip-10-0-131-177 kubenswrapper[2572]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 16:31:17.706346 ip-10-0-131-177 kubenswrapper[2572]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 17 16:31:17.706346 ip-10-0-131-177 kubenswrapper[2572]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 16:31:17.706346 ip-10-0-131-177 kubenswrapper[2572]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 17 16:31:17.706346 ip-10-0-131-177 kubenswrapper[2572]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 16:31:17.707275 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.707187    2572 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 17 16:31:17.712650 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.712634    2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 16:31:17.712650 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.712650    2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 16:31:17.712734 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.712653    2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 16:31:17.712734 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.712656    2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 16:31:17.712734 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.712660    2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 16:31:17.712734 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.712663    2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 16:31:17.712734 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.712666    2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 16:31:17.712734 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.712668    2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 16:31:17.712734 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.712671    2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 16:31:17.712734 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.712673    2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 16:31:17.712734 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.712676    2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 16:31:17.712734 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.712679    2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 16:31:17.712734 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.712682    2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 16:31:17.712734 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.712684    2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 16:31:17.712734 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.712686    2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 16:31:17.712734 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.712689    2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 16:31:17.712734 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.712691    2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 16:31:17.712734 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.712694    2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 16:31:17.712734 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.712697    2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 16:31:17.712734 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.712699    2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 16:31:17.712734 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.712702    2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 16:31:17.712734 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.712706    2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 16:31:17.713219 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.712709    2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 16:31:17.713219 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.712711    2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 16:31:17.713219 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.712728    2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 16:31:17.713219 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.712731    2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 16:31:17.713219 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.712735    2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 16:31:17.713219 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.712738    2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 16:31:17.713219 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.712740    2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 16:31:17.713219 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.712743    2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 16:31:17.713219 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.712746    2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 16:31:17.713219 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.712749    2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 16:31:17.713219 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.712751    2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 16:31:17.713219 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.712754    2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 16:31:17.713219 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.712756    2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 16:31:17.713219 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.712759    2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 16:31:17.713219 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.712762    2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 16:31:17.713219 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.712764    2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 16:31:17.713219 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.712766    2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 16:31:17.713219 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.712769    2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 16:31:17.713219 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.712771    2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 16:31:17.713219 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.712774    2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 16:31:17.713695 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.712776    2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 16:31:17.713695 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.712779    2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 16:31:17.713695 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.712781    2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 16:31:17.713695 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.712783    2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 16:31:17.713695 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.712786    2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 16:31:17.713695 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.712788    2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 16:31:17.713695 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.712791    2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 16:31:17.713695 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.712793    2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 16:31:17.713695 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.712795    2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 16:31:17.713695 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.712798    2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 16:31:17.713695 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.712802    2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 16:31:17.713695 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.712806    2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 16:31:17.713695 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.712811    2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 16:31:17.713695 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.712815    2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 16:31:17.713695 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.712818    2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 16:31:17.713695 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.712821    2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 16:31:17.713695 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.712824    2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 16:31:17.713695 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.712826    2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 16:31:17.713695 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.712829    2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 16:31:17.714157 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.712831    2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 16:31:17.714157 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.712834    2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 16:31:17.714157 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.712836    2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 16:31:17.714157 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.712839    2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 16:31:17.714157 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.712841    2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 16:31:17.714157 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.712844    2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 16:31:17.714157 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.712846    2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 16:31:17.714157 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.712849    2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 16:31:17.714157 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.712851    2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 16:31:17.714157 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.712854    2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 16:31:17.714157 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.712856    2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 16:31:17.714157 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.712860    2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 16:31:17.714157 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.712862    2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 16:31:17.714157 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.712865    2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 16:31:17.714157 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.712867    2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 16:31:17.714157 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.712870    2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 16:31:17.714157 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.712873    2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 16:31:17.714157 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.712875    2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 16:31:17.714157 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.712878    2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 16:31:17.714157 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.712880    2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 16:31:17.714624 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.712883    2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 16:31:17.714624 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.712886    2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 16:31:17.714624 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.712889    2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 16:31:17.714624 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.712892    2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 16:31:17.714624 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.712894    2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 16:31:17.714624 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.713263    2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 16:31:17.714624 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.713268    2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 16:31:17.714624 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.713271    2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 16:31:17.714624 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.713274    2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 16:31:17.714624 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.713276    2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 16:31:17.714624 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.713279    2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 16:31:17.714624 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.713281    2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 16:31:17.714624 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.713284    2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 16:31:17.714624 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.713286    2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 16:31:17.714624 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.713291    2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 16:31:17.714624 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.713294    2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 16:31:17.714624 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.713297    2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 16:31:17.714624 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.713300    2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 16:31:17.714624 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.713303    2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 16:31:17.715092 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.713306    2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 16:31:17.715092 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.713308    2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 16:31:17.715092 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.713311    2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 16:31:17.715092 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.713314    2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 16:31:17.715092 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.713316    2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 16:31:17.715092 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.713319    2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 16:31:17.715092 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.713322    2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 16:31:17.715092 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.713325    2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 16:31:17.715092 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.713329    2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 16:31:17.715092 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.713332    2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 16:31:17.715092 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.713334    2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 16:31:17.715092 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.713337    2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 16:31:17.715092 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.713339    2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 16:31:17.715092 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.713342    2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 16:31:17.715092 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.713344    2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 16:31:17.715092 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.713347    2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 16:31:17.715092 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.713349    2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 16:31:17.715092 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.713352    2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 16:31:17.715092 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.713354    2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 16:31:17.715092 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.713359    2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 16:31:17.715648 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.713361    2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 16:31:17.715648 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.713364    2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 16:31:17.715648 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.713366    2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 16:31:17.715648 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.713369    2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 16:31:17.715648 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.713371    2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 16:31:17.715648 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.713374    2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 16:31:17.715648 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.713376    2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 16:31:17.715648 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.713379    2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 16:31:17.715648 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.713381    2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 16:31:17.715648 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.713383    2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 16:31:17.715648 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.713386    2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 16:31:17.715648 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.713389    2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 16:31:17.715648 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.713392    2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 16:31:17.715648 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.713394    2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 16:31:17.715648 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.713397    2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 16:31:17.715648 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.713399    2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 16:31:17.715648 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.713402    2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 16:31:17.715648 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.713405    2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 16:31:17.715648 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.713407    2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 16:31:17.716162 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.713411    2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 16:31:17.716162 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.713413    2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 16:31:17.716162 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.713416    2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 16:31:17.716162 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.713418    2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 16:31:17.716162 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.713421    2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 16:31:17.716162 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.713423    2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 16:31:17.716162 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.713425    2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 16:31:17.716162 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.713428    2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 16:31:17.716162 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.713430    2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 16:31:17.716162 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.713433    2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 16:31:17.716162 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.713436    2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 16:31:17.716162 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.713440    2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 16:31:17.716162 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.713443    2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 16:31:17.716162 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.713446    2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 16:31:17.716162 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.713448    2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 16:31:17.716162 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.713451    2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 16:31:17.716162 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.713453    2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 16:31:17.716162 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.713455    2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 16:31:17.716162 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.713458    2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 16:31:17.716162 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.713460    2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 16:31:17.716653 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.713462    2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 16:31:17.716653 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.713465    2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 16:31:17.716653 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.713467    2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 16:31:17.716653 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.713470    2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 16:31:17.716653 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.713472    2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 16:31:17.716653 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.713475    2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 16:31:17.716653 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.713477    2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 16:31:17.716653 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.713480    2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 16:31:17.716653 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.713483    2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 16:31:17.716653 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.713486    2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 16:31:17.716653 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.713489    2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 16:31:17.716653 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.713491    2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 16:31:17.716653 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.713494    2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 16:31:17.716653 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713580    2572 flags.go:64] FLAG: --address="0.0.0.0"
Apr 17 16:31:17.716653 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713591    2572 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 17 16:31:17.716653 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713600    2572 flags.go:64] FLAG: --anonymous-auth="true"
Apr 17 16:31:17.716653 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713606    2572 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 17 16:31:17.716653 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713610    2572 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 17 16:31:17.716653 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713613    2572 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 17 16:31:17.716653 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713617    2572 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 17 16:31:17.716653 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713621    2572 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 17 16:31:17.717269 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713625    2572 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 17 16:31:17.717269 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713628    2572 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 17 16:31:17.717269 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713631    2572 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 17 16:31:17.717269 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713634    2572 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 17 16:31:17.717269 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713637    2572 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 17 16:31:17.717269 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713641    2572 flags.go:64] FLAG: --cgroup-root=""
Apr 17 16:31:17.717269 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713644    2572 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 17 16:31:17.717269 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713646    2572 flags.go:64] FLAG: --client-ca-file=""
Apr 17 16:31:17.717269 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713649    2572 flags.go:64] FLAG: --cloud-config=""
Apr 17 16:31:17.717269 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713652    2572 flags.go:64] FLAG: --cloud-provider="external"
Apr 17 16:31:17.717269 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713655    2572 flags.go:64] FLAG: --cluster-dns="[]"
Apr 17 16:31:17.717269 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713659    2572 flags.go:64] FLAG: --cluster-domain=""
Apr 17 16:31:17.717269 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713662    2572 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 17 16:31:17.717269 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713665    2572 flags.go:64] FLAG: --config-dir=""
Apr 17 16:31:17.717269 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713668    2572 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 17 16:31:17.717269 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713671    2572 flags.go:64] FLAG: --container-log-max-files="5"
Apr 17 16:31:17.717269 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713675    2572 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 17 16:31:17.717269 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713678    2572 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 17 16:31:17.717269 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713681    2572 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 17 16:31:17.717269 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713684    2572 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 17 16:31:17.717269 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713688    2572 flags.go:64] FLAG: --contention-profiling="false"
Apr 17 16:31:17.717269 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713691    2572 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 17 16:31:17.717269 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713694    2572 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 17 16:31:17.717269 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713697    2572 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 17 16:31:17.717269 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713700    2572 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 17 16:31:17.717912 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713705    2572 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 17 16:31:17.717912 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713708    2572 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 17 16:31:17.717912 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713711    2572 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 17 16:31:17.717912 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713727    2572 flags.go:64] FLAG: --enable-load-reader="false"
Apr 17 16:31:17.717912 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713730    2572 flags.go:64] FLAG: --enable-server="true"
Apr 17 16:31:17.717912 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713733    2572 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 17 16:31:17.717912 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713737    2572 flags.go:64] FLAG: --event-burst="100"
Apr 17 16:31:17.717912 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713740    2572 flags.go:64] FLAG: --event-qps="50"
Apr 17 16:31:17.717912 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713743    2572 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 17 16:31:17.717912 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713746    2572 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 17 16:31:17.717912 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713749    2572 flags.go:64] FLAG: --eviction-hard=""
Apr 17 16:31:17.717912 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713753    2572 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 17 16:31:17.717912 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713755    2572 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 17 16:31:17.717912 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713759    2572 flags.go:64] FLAG:
--eviction-pressure-transition-period="5m0s" Apr 17 16:31:17.717912 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713762 2572 flags.go:64] FLAG: --eviction-soft="" Apr 17 16:31:17.717912 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713765 2572 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 17 16:31:17.717912 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713768 2572 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 17 16:31:17.717912 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713771 2572 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 17 16:31:17.717912 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713773 2572 flags.go:64] FLAG: --experimental-mounter-path="" Apr 17 16:31:17.717912 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713776 2572 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 17 16:31:17.717912 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713779 2572 flags.go:64] FLAG: --fail-swap-on="true" Apr 17 16:31:17.717912 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713782 2572 flags.go:64] FLAG: --feature-gates="" Apr 17 16:31:17.717912 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713785 2572 flags.go:64] FLAG: --file-check-frequency="20s" Apr 17 16:31:17.717912 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713788 2572 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 17 16:31:17.717912 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713791 2572 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 17 16:31:17.718517 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713795 2572 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 17 16:31:17.718517 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713799 2572 flags.go:64] FLAG: --healthz-port="10248" Apr 17 16:31:17.718517 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713802 2572 flags.go:64] FLAG: --help="false" Apr 17 16:31:17.718517 ip-10-0-131-177 kubenswrapper[2572]: 
I0417 16:31:17.713805 2572 flags.go:64] FLAG: --hostname-override="ip-10-0-131-177.ec2.internal" Apr 17 16:31:17.718517 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713807 2572 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 17 16:31:17.718517 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713810 2572 flags.go:64] FLAG: --http-check-frequency="20s" Apr 17 16:31:17.718517 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713813 2572 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 17 16:31:17.718517 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713816 2572 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 17 16:31:17.718517 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713819 2572 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 17 16:31:17.718517 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713822 2572 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 17 16:31:17.718517 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713825 2572 flags.go:64] FLAG: --image-service-endpoint="" Apr 17 16:31:17.718517 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713827 2572 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 17 16:31:17.718517 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713830 2572 flags.go:64] FLAG: --kube-api-burst="100" Apr 17 16:31:17.718517 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713833 2572 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 17 16:31:17.718517 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713836 2572 flags.go:64] FLAG: --kube-api-qps="50" Apr 17 16:31:17.718517 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713839 2572 flags.go:64] FLAG: --kube-reserved="" Apr 17 16:31:17.718517 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713842 2572 flags.go:64] FLAG: 
--kube-reserved-cgroup="" Apr 17 16:31:17.718517 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713844 2572 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 17 16:31:17.718517 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713847 2572 flags.go:64] FLAG: --kubelet-cgroups="" Apr 17 16:31:17.718517 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713850 2572 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 17 16:31:17.718517 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713853 2572 flags.go:64] FLAG: --lock-file="" Apr 17 16:31:17.718517 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713856 2572 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 17 16:31:17.718517 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713859 2572 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 17 16:31:17.718517 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713862 2572 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 17 16:31:17.719139 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713867 2572 flags.go:64] FLAG: --log-json-split-stream="false" Apr 17 16:31:17.719139 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713870 2572 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 17 16:31:17.719139 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713872 2572 flags.go:64] FLAG: --log-text-split-stream="false" Apr 17 16:31:17.719139 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713875 2572 flags.go:64] FLAG: --logging-format="text" Apr 17 16:31:17.719139 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713878 2572 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 17 16:31:17.719139 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713881 2572 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 17 16:31:17.719139 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713884 2572 flags.go:64] FLAG: --manifest-url="" Apr 17 16:31:17.719139 ip-10-0-131-177 
kubenswrapper[2572]: I0417 16:31:17.713886 2572 flags.go:64] FLAG: --manifest-url-header="" Apr 17 16:31:17.719139 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713890 2572 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 17 16:31:17.719139 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713897 2572 flags.go:64] FLAG: --max-open-files="1000000" Apr 17 16:31:17.719139 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713901 2572 flags.go:64] FLAG: --max-pods="110" Apr 17 16:31:17.719139 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713904 2572 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 17 16:31:17.719139 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713907 2572 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 17 16:31:17.719139 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713910 2572 flags.go:64] FLAG: --memory-manager-policy="None" Apr 17 16:31:17.719139 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713913 2572 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 17 16:31:17.719139 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713916 2572 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 17 16:31:17.719139 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713919 2572 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 17 16:31:17.719139 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713921 2572 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 17 16:31:17.719139 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713929 2572 flags.go:64] FLAG: --node-status-max-images="50" Apr 17 16:31:17.719139 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713932 2572 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 17 16:31:17.719139 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713935 2572 flags.go:64] FLAG: --oom-score-adj="-999" Apr 17 16:31:17.719139 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713938 
2572 flags.go:64] FLAG: --pod-cidr="" Apr 17 16:31:17.719139 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713941 2572 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 17 16:31:17.719680 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713947 2572 flags.go:64] FLAG: --pod-manifest-path="" Apr 17 16:31:17.719680 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713949 2572 flags.go:64] FLAG: --pod-max-pids="-1" Apr 17 16:31:17.719680 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713952 2572 flags.go:64] FLAG: --pods-per-core="0" Apr 17 16:31:17.719680 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713955 2572 flags.go:64] FLAG: --port="10250" Apr 17 16:31:17.719680 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713958 2572 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 17 16:31:17.719680 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713961 2572 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-01b43d68fd872eaad" Apr 17 16:31:17.719680 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713964 2572 flags.go:64] FLAG: --qos-reserved="" Apr 17 16:31:17.719680 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713968 2572 flags.go:64] FLAG: --read-only-port="10255" Apr 17 16:31:17.719680 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713970 2572 flags.go:64] FLAG: --register-node="true" Apr 17 16:31:17.719680 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713973 2572 flags.go:64] FLAG: --register-schedulable="true" Apr 17 16:31:17.719680 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713976 2572 flags.go:64] FLAG: --register-with-taints="" Apr 17 16:31:17.719680 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713980 2572 flags.go:64] FLAG: --registry-burst="10" Apr 17 16:31:17.719680 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713983 2572 flags.go:64] FLAG: --registry-qps="5" 
Apr 17 16:31:17.719680 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713985 2572 flags.go:64] FLAG: --reserved-cpus="" Apr 17 16:31:17.719680 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713988 2572 flags.go:64] FLAG: --reserved-memory="" Apr 17 16:31:17.719680 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713991 2572 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 17 16:31:17.719680 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713995 2572 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 17 16:31:17.719680 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.713997 2572 flags.go:64] FLAG: --rotate-certificates="false" Apr 17 16:31:17.719680 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.714000 2572 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 17 16:31:17.719680 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.714005 2572 flags.go:64] FLAG: --runonce="false" Apr 17 16:31:17.719680 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.714007 2572 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 17 16:31:17.719680 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.714010 2572 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 17 16:31:17.719680 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.714013 2572 flags.go:64] FLAG: --seccomp-default="false" Apr 17 16:31:17.719680 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.714016 2572 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 17 16:31:17.719680 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.714019 2572 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 17 16:31:17.719680 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.714022 2572 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 17 16:31:17.720329 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.714025 2572 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 17 16:31:17.720329 ip-10-0-131-177 kubenswrapper[2572]: I0417 
16:31:17.714028 2572 flags.go:64] FLAG: --storage-driver-password="root" Apr 17 16:31:17.720329 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.714031 2572 flags.go:64] FLAG: --storage-driver-secure="false" Apr 17 16:31:17.720329 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.714033 2572 flags.go:64] FLAG: --storage-driver-table="stats" Apr 17 16:31:17.720329 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.714036 2572 flags.go:64] FLAG: --storage-driver-user="root" Apr 17 16:31:17.720329 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.714039 2572 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 17 16:31:17.720329 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.714042 2572 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 17 16:31:17.720329 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.714045 2572 flags.go:64] FLAG: --system-cgroups="" Apr 17 16:31:17.720329 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.714047 2572 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 17 16:31:17.720329 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.714052 2572 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 17 16:31:17.720329 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.714055 2572 flags.go:64] FLAG: --tls-cert-file="" Apr 17 16:31:17.720329 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.714058 2572 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 17 16:31:17.720329 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.714062 2572 flags.go:64] FLAG: --tls-min-version="" Apr 17 16:31:17.720329 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.714065 2572 flags.go:64] FLAG: --tls-private-key-file="" Apr 17 16:31:17.720329 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.714075 2572 flags.go:64] FLAG: --topology-manager-policy="none" Apr 17 16:31:17.720329 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.714078 2572 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 17 
16:31:17.720329 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.714081 2572 flags.go:64] FLAG: --topology-manager-scope="container" Apr 17 16:31:17.720329 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.714084 2572 flags.go:64] FLAG: --v="2" Apr 17 16:31:17.720329 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.714092 2572 flags.go:64] FLAG: --version="false" Apr 17 16:31:17.720329 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.714100 2572 flags.go:64] FLAG: --vmodule="" Apr 17 16:31:17.720329 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.714104 2572 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 17 16:31:17.720329 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.714108 2572 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 17 16:31:17.720329 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.714202 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 16:31:17.720329 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.714206 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 16:31:17.720943 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.714209 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 16:31:17.720943 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.714214 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 16:31:17.720943 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.714217 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 16:31:17.720943 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.714219 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 16:31:17.720943 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.714222 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 16:31:17.720943 ip-10-0-131-177 kubenswrapper[2572]: W0417 
16:31:17.714224 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 16:31:17.720943 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.714227 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 16:31:17.720943 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.714229 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 16:31:17.720943 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.714232 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 16:31:17.720943 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.714235 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 16:31:17.720943 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.714237 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 16:31:17.720943 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.714240 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 16:31:17.720943 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.714244 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 17 16:31:17.720943 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.714247 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 16:31:17.720943 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.714250 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 16:31:17.720943 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.714253 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 16:31:17.720943 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.714255 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 16:31:17.720943 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.714258 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 16:31:17.720943 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.714261 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 16:31:17.721427 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.714264 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 16:31:17.721427 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.714266 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 16:31:17.721427 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.714269 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 16:31:17.721427 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.714271 2572 feature_gate.go:328] unrecognized feature gate: Example Apr 17 16:31:17.721427 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.714274 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 16:31:17.721427 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.714277 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 16:31:17.721427 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.714280 2572 feature_gate.go:328] 
unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 16:31:17.721427 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.714282 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 16:31:17.721427 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.714284 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 16:31:17.721427 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.714287 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 16:31:17.721427 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.714289 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 16:31:17.721427 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.714291 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 16:31:17.721427 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.714294 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 16:31:17.721427 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.714299 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 16:31:17.721427 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.714304 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 17 16:31:17.721427 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.714307 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 16:31:17.721427 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.714310 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 16:31:17.721427 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.714312 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 16:31:17.721427 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.714314 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 16:31:17.722254 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.714317 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 16:31:17.722254 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.714319 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 16:31:17.722254 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.714322 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 16:31:17.722254 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.714325 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 16:31:17.722254 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.714328 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 16:31:17.722254 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.714330 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 16:31:17.722254 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.714332 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 16:31:17.722254 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.714335 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 16:31:17.722254 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.714337 2572 feature_gate.go:328] unrecognized feature gate: 
SigstoreImageVerification Apr 17 16:31:17.722254 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.714339 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 16:31:17.722254 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.714342 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 16:31:17.722254 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.714345 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 16:31:17.722254 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.714347 2572 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 16:31:17.722254 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.714350 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 16:31:17.722254 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.714352 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 16:31:17.722254 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.714354 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 16:31:17.722254 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.714357 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 16:31:17.722254 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.714360 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 16:31:17.722254 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.714362 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 16:31:17.722254 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.714365 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 16:31:17.723084 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.714367 2572 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 16:31:17.723084 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.714370 2572 
feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 16:31:17.723084 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.714372 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 16:31:17.723084 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.714375 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 16:31:17.723084 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.714377 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 16:31:17.723084 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.714380 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 16:31:17.723084 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.714383 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 16:31:17.723084 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.714387 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 16:31:17.723084 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.714390 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 16:31:17.723084 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.714392 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 16:31:17.723084 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.714395 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 16:31:17.723084 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.714397 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 16:31:17.723084 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.714399 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 16:31:17.723084 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.714402 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 16:31:17.723084 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.714404 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 16:31:17.723084 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.714407 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 16:31:17.723084 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.714409 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 16:31:17.723084 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.714412 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 16:31:17.723084 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.714414 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 16:31:17.723084 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.714417 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 16:31:17.723871 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.714419 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 16:31:17.723871 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.714422 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 16:31:17.723871 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.714424 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 16:31:17.723871 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.714427 2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 16:31:17.723871 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.714429 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 16:31:17.723871 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.714432 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 16:31:17.723871 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.715052 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 16:31:17.724496 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.724477 2572 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 17 16:31:17.724528 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.724497 2572 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 17 16:31:17.724557 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724548 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 16:31:17.724557 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724553 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 16:31:17.724557 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724557 2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 16:31:17.724634 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724560 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 16:31:17.724634 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724563 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 16:31:17.724634 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724566 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 16:31:17.724634 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724569 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 16:31:17.724634 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724572 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 16:31:17.724634 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724575 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 16:31:17.724634 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724578 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 16:31:17.724634 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724581 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 16:31:17.724634 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724583 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 16:31:17.724634 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724586 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 16:31:17.724634 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724589 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 16:31:17.724634 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724592 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 16:31:17.724634 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724594 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 16:31:17.724634 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724597 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 16:31:17.724634 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724599 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 16:31:17.724634 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724602 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 16:31:17.724634 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724605 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 16:31:17.724634 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724607 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 16:31:17.724634 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724610 2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 16:31:17.724634 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724613 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 16:31:17.725220 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724617 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 16:31:17.725220 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724621 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 16:31:17.725220 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724623 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 16:31:17.725220 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724626 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 16:31:17.725220 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724629 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 16:31:17.725220 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724632 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 16:31:17.725220 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724635 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 16:31:17.725220 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724638 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 16:31:17.725220 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724642 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 16:31:17.725220 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724646 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 16:31:17.725220 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724650 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 16:31:17.725220 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724653 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 16:31:17.725220 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724655 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 16:31:17.725220 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724658 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 16:31:17.725220 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724661 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 16:31:17.725220 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724664 2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 16:31:17.725220 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724666 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 16:31:17.725220 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724669 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 16:31:17.725220 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724671 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 16:31:17.725729 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724674 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 16:31:17.725729 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724677 2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 16:31:17.725729 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724679 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 16:31:17.725729 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724682 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 16:31:17.725729 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724685 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 16:31:17.725729 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724687 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 16:31:17.725729 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724690 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 16:31:17.725729 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724692 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 16:31:17.725729 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724695 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 16:31:17.725729 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724697 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 16:31:17.725729 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724700 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 16:31:17.725729 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724702 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 16:31:17.725729 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724704 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 16:31:17.725729 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724707 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 16:31:17.725729 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724710 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 16:31:17.725729 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724734 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 16:31:17.725729 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724739 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 16:31:17.725729 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724742 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 16:31:17.725729 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724745 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 16:31:17.725729 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724748 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 16:31:17.726222 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724750 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 16:31:17.726222 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724753 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 16:31:17.726222 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724756 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 16:31:17.726222 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724759 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 16:31:17.726222 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724761 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 16:31:17.726222 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724763 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 16:31:17.726222 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724766 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 16:31:17.726222 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724769 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 16:31:17.726222 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724771 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 16:31:17.726222 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724774 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 16:31:17.726222 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724776 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 16:31:17.726222 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724779 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 16:31:17.726222 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724781 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 16:31:17.726222 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724784 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 16:31:17.726222 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724787 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 16:31:17.726222 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724790 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 16:31:17.726222 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724792 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 16:31:17.726222 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724795 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 16:31:17.726222 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724797 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 16:31:17.726222 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724800 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 16:31:17.726704 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724802 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 16:31:17.726704 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724805 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 16:31:17.726704 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724808 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 16:31:17.726704 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724810 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 16:31:17.726704 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.724815 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 16:31:17.726704 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724910 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 16:31:17.726704 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724916 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 16:31:17.726704 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724919 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 16:31:17.726704 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724922 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 16:31:17.726704 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724924 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 16:31:17.726704 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724927 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 16:31:17.726704 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724929 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 16:31:17.726704 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724933 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 16:31:17.726704 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724937 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 16:31:17.726704 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724940 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 16:31:17.727093 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724943 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 16:31:17.727093 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724945 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 16:31:17.727093 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724948 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 16:31:17.727093 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724951 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 16:31:17.727093 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724953 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 16:31:17.727093 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724956 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 16:31:17.727093 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724958 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 16:31:17.727093 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724961 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 16:31:17.727093 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724963 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 16:31:17.727093 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724966 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 16:31:17.727093 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724968 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 16:31:17.727093 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724971 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 16:31:17.727093 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724974 2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 16:31:17.727093 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724977 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 16:31:17.727093 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724980 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 16:31:17.727093 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724982 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 16:31:17.727093 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724985 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 16:31:17.727093 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724987 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 16:31:17.727093 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724990 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 16:31:17.727543 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724992 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 16:31:17.727543 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724995 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 16:31:17.727543 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724997 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 16:31:17.727543 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.724999 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 16:31:17.727543 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.725003 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 16:31:17.727543 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.725006 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 16:31:17.727543 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.725008 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 16:31:17.727543 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.725011 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 16:31:17.727543 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.725013 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 16:31:17.727543 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.725016 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 16:31:17.727543 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.725018 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 16:31:17.727543 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.725021 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 16:31:17.727543 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.725024 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 16:31:17.727543 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.725026 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 16:31:17.727543 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.725028 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 16:31:17.727543 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.725031 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 16:31:17.727543 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.725033 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 16:31:17.727543 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.725036 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 16:31:17.727543 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.725038 2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 16:31:17.727543 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.725040 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 16:31:17.728042 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.725043 2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 16:31:17.728042 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.725045 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 16:31:17.728042 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.725048 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 16:31:17.728042 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.725050 2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 16:31:17.728042 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.725053 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 16:31:17.728042 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.725055 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 16:31:17.728042 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.725058 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 16:31:17.728042 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.725061 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 16:31:17.728042 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.725063 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 16:31:17.728042 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.725066 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 16:31:17.728042 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.725068 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 16:31:17.728042 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.725071 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 16:31:17.728042 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.725073 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 16:31:17.728042 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.725075 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 16:31:17.728042 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.725079 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 16:31:17.728042 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.725083 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 16:31:17.728042 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.725085 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 16:31:17.728042 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.725088 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 16:31:17.728042 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.725092 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 16:31:17.728042 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.725094 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 16:31:17.728531 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.725097 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 16:31:17.728531 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.725099 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 16:31:17.728531 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.725102 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 16:31:17.728531 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.725104 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 16:31:17.728531 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.725107 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 16:31:17.728531 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.725109 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 16:31:17.728531 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.725112 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 16:31:17.728531 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.725114 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 16:31:17.728531 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.725116 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 16:31:17.728531 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.725119 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 16:31:17.728531 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.725121 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 16:31:17.728531 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.725124 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 16:31:17.728531 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.725126 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 16:31:17.728531 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.725128 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 16:31:17.728531 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.725131 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 16:31:17.728531 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.725133 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 16:31:17.728531 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:17.725135 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 16:31:17.728990 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.725140 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 16:31:17.728990 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.725277 2572 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 17 16:31:17.728990 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.727152 2572 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 17 16:31:17.728990 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.727980 2572 server.go:1019] "Starting client certificate rotation"
Apr 17 16:31:17.728990 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.728072 2572 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 16:31:17.728990 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.728428 2572 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 16:31:17.749961 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.749941 2572 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 16:31:17.754759 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.754735 2572 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 16:31:17.766774 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.766753 2572 log.go:25] "Validated CRI v1 runtime API"
Apr 17 16:31:17.771456 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.771442 2572 log.go:25] "Validated CRI v1 image API"
Apr 17 16:31:17.774340 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.774322 2572 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 17 16:31:17.779384 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.779365 2572 fs.go:135] Filesystem UUIDs: map[00e57168-e996-40ab-9fd0-65e8c71724b4:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 ee25b53d-07b5-48af-979d-d393e0a60c3a:/dev/nvme0n1p4]
Apr 17 16:31:17.779445 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.779384 2572 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 17 16:31:17.782249 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.782233 2572 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 17 16:31:17.784925 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.784818 2572 manager.go:217] Machine: {Timestamp:2026-04-17 16:31:17.782834876 +0000 UTC m=+0.370423837 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3099601 MemoryCapacity:32812167168 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2bfbbfab649869f2b98cf68a59c54e SystemUUID:ec2bfbbf-ab64-9869-f2b9-8cf68a59c54e BootID:e93eab51-c8ce-4fc6-94e4-98bbd61a6862 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406081536 Type:vfs Inodes:4005391 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406085632 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:08:1c:af:96:17 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:08:1c:af:96:17 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:0a:f8:76:18:9a:d1 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812167168 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 17 16:31:17.784925 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.784923 2572 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 17 16:31:17.785044 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.784994 2572 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 17 16:31:17.785867 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.785844 2572 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 17 16:31:17.785999 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.785869 2572 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-131-177.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":
"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 17 16:31:17.786044 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.786009 2572 topology_manager.go:138] "Creating topology manager with none policy" Apr 17 16:31:17.786044 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.786018 2572 container_manager_linux.go:306] "Creating device plugin manager" Apr 17 16:31:17.786044 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.786034 2572 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 16:31:17.786751 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.786741 2572 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 16:31:17.787995 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.787985 2572 state_mem.go:36] "Initialized new in-memory state store" Apr 17 16:31:17.788193 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.788172 2572 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 17 16:31:17.792984 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.792969 2572 kubelet.go:491] "Attempting to sync node with API server" Apr 17 16:31:17.793050 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.792987 2572 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 17 16:31:17.793050 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.793000 2572 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 17 16:31:17.793050 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.793010 2572 kubelet.go:397] "Adding apiserver pod source" Apr 17 16:31:17.793050 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.793019 2572 apiserver.go:42] "Waiting for node sync 
before watching apiserver pods" Apr 17 16:31:17.793948 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.793937 2572 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 16:31:17.793985 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.793955 2572 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 16:31:17.796708 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.796691 2572 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 17 16:31:17.797946 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.797930 2572 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 17 16:31:17.799550 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.799535 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 17 16:31:17.799636 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.799556 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 17 16:31:17.799636 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.799565 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 17 16:31:17.799636 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.799572 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 17 16:31:17.799636 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.799580 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 17 16:31:17.799636 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.799588 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 17 16:31:17.799636 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.799595 2572 plugins.go:616] "Loaded volume plugin" 
pluginName="kubernetes.io/iscsi" Apr 17 16:31:17.799636 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.799604 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 17 16:31:17.799636 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.799614 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 17 16:31:17.799636 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.799623 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 17 16:31:17.799636 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.799635 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 17 16:31:17.799942 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.799647 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 17 16:31:17.800571 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.800559 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 17 16:31:17.800626 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.800574 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 17 16:31:17.804078 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.804063 2572 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 17 16:31:17.804175 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.804105 2572 server.go:1295] "Started kubelet" Apr 17 16:31:17.804231 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.804201 2572 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 17 16:31:17.804303 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.804266 2572 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 17 16:31:17.804342 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.804322 2572 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 17 16:31:17.804945 ip-10-0-131-177 systemd[1]: Started Kubernetes Kubelet. 
Apr 17 16:31:17.805471 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.805319 2572 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 17 16:31:17.805892 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.805884 2572 server.go:317] "Adding debug handlers to kubelet server" Apr 17 16:31:17.806936 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:17.806913 2572 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 17 16:31:17.807055 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.807037 2572 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-177.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 17 16:31:17.807187 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:17.807171 2572 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-131-177.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 17 16:31:17.809809 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.809792 2572 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 17 16:31:17.809906 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.809829 2572 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 17 16:31:17.810482 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.810465 2572 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 17 16:31:17.810538 ip-10-0-131-177 
kubenswrapper[2572]: I0417 16:31:17.810484 2572 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 17 16:31:17.810588 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.810577 2572 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 17 16:31:17.810637 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.810628 2572 reconstruct.go:97] "Volume reconstruction finished" Apr 17 16:31:17.810667 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.810639 2572 reconciler.go:26] "Reconciler: start to sync state" Apr 17 16:31:17.810952 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.810935 2572 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-vrxzj" Apr 17 16:31:17.811015 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:17.809915 2572 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-131-177.ec2.internal.18a731ee5a124d50 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-131-177.ec2.internal,UID:ip-10-0-131-177.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-131-177.ec2.internal,},FirstTimestamp:2026-04-17 16:31:17.804076368 +0000 UTC m=+0.391665333,LastTimestamp:2026-04-17 16:31:17.804076368 +0000 UTC m=+0.391665333,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-131-177.ec2.internal,}" Apr 17 16:31:17.811015 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:17.811006 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-177.ec2.internal\" not found" Apr 17 
16:31:17.811133 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.811082 2572 factory.go:55] Registering systemd factory Apr 17 16:31:17.811133 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.811128 2572 factory.go:223] Registration of the systemd container factory successfully Apr 17 16:31:17.812845 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.812127 2572 factory.go:153] Registering CRI-O factory Apr 17 16:31:17.812845 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.812146 2572 factory.go:223] Registration of the crio container factory successfully Apr 17 16:31:17.812845 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.812212 2572 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 17 16:31:17.812845 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.812231 2572 factory.go:103] Registering Raw factory Apr 17 16:31:17.812845 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.812244 2572 manager.go:1196] Started watching for new ooms in manager Apr 17 16:31:17.812845 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.812651 2572 manager.go:319] Starting recovery of all containers Apr 17 16:31:17.818256 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:17.818222 2572 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-131-177.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 17 16:31:17.818583 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:17.818465 2572 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster 
scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 17 16:31:17.818908 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.818886 2572 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-vrxzj" Apr 17 16:31:17.825059 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.825042 2572 manager.go:324] Recovery completed Apr 17 16:31:17.829022 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.829010 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 16:31:17.831241 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.831224 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-177.ec2.internal" event="NodeHasSufficientMemory" Apr 17 16:31:17.831324 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.831251 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-177.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 16:31:17.831324 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.831260 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-177.ec2.internal" event="NodeHasSufficientPID" Apr 17 16:31:17.831686 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.831673 2572 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 17 16:31:17.831686 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.831686 2572 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 17 16:31:17.831794 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.831700 2572 state_mem.go:36] "Initialized new in-memory state store" Apr 17 16:31:17.833191 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:17.833132 2572 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" 
event="&Event{ObjectMeta:{ip-10-0-131-177.ec2.internal.18a731ee5bb0c794 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-131-177.ec2.internal,UID:ip-10-0-131-177.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-131-177.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-131-177.ec2.internal,},FirstTimestamp:2026-04-17 16:31:17.831239572 +0000 UTC m=+0.418828532,LastTimestamp:2026-04-17 16:31:17.831239572 +0000 UTC m=+0.418828532,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-131-177.ec2.internal,}" Apr 17 16:31:17.834173 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.834158 2572 policy_none.go:49] "None policy: Start" Apr 17 16:31:17.834173 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.834174 2572 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 17 16:31:17.834271 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.834183 2572 state_mem.go:35] "Initializing new in-memory state store" Apr 17 16:31:17.875272 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.875256 2572 manager.go:341] "Starting Device Plugin manager" Apr 17 16:31:17.880162 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:17.875336 2572 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 17 16:31:17.880162 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.875353 2572 server.go:85] "Starting device plugin registration server" Apr 17 16:31:17.880162 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.875575 2572 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 17 16:31:17.880162 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.875585 2572 container_log_manager.go:189] "Initializing container 
log rotate workers" workers=1 monitorPeriod="10s" Apr 17 16:31:17.880162 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.875747 2572 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 17 16:31:17.880162 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.875826 2572 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 17 16:31:17.880162 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.875838 2572 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 17 16:31:17.880162 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:17.876252 2572 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 17 16:31:17.880162 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:17.876283 2572 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-131-177.ec2.internal\" not found" Apr 17 16:31:17.967206 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.967141 2572 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 17 16:31:17.968455 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.968436 2572 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 17 16:31:17.968531 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.968458 2572 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 17 16:31:17.968531 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.968473 2572 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 17 16:31:17.968531 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.968481 2572 kubelet.go:2451] "Starting kubelet main sync loop" Apr 17 16:31:17.968531 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:17.968509 2572 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 17 16:31:17.970983 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.970964 2572 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 16:31:17.976301 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.976287 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 16:31:17.977529 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.977513 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-177.ec2.internal" event="NodeHasSufficientMemory" Apr 17 16:31:17.977603 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.977539 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-177.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 16:31:17.977603 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.977550 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-177.ec2.internal" event="NodeHasSufficientPID" Apr 17 16:31:17.977603 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.977578 2572 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-131-177.ec2.internal" Apr 17 16:31:17.986146 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:17.986133 2572 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-131-177.ec2.internal" Apr 17 16:31:17.986199 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:17.986153 2572 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-131-177.ec2.internal\": node \"ip-10-0-131-177.ec2.internal\" not found" Apr 17 
16:31:18.001761 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:18.001740 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-177.ec2.internal\" not found" Apr 17 16:31:18.069311 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:18.069289 2572 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-177.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-131-177.ec2.internal"] Apr 17 16:31:18.069409 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:18.069350 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 16:31:18.070244 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:18.070232 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-177.ec2.internal" event="NodeHasSufficientMemory" Apr 17 16:31:18.070309 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:18.070257 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-177.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 16:31:18.070309 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:18.070266 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-177.ec2.internal" event="NodeHasSufficientPID" Apr 17 16:31:18.071605 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:18.071593 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 16:31:18.071815 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:18.071801 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-177.ec2.internal" Apr 17 16:31:18.071859 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:18.071829 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 16:31:18.072370 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:18.072355 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-177.ec2.internal" event="NodeHasSufficientMemory" Apr 17 16:31:18.072370 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:18.072368 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-177.ec2.internal" event="NodeHasSufficientMemory" Apr 17 16:31:18.072483 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:18.072384 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-177.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 16:31:18.072483 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:18.072391 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-177.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 16:31:18.072483 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:18.072397 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-177.ec2.internal" event="NodeHasSufficientPID" Apr 17 16:31:18.072483 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:18.072404 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-177.ec2.internal" event="NodeHasSufficientPID" Apr 17 16:31:18.073425 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:18.073410 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-177.ec2.internal" Apr 17 16:31:18.073501 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:18.073437 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 16:31:18.074172 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:18.074156 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-177.ec2.internal" event="NodeHasSufficientMemory" Apr 17 16:31:18.074238 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:18.074181 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-177.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 16:31:18.074238 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:18.074191 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-177.ec2.internal" event="NodeHasSufficientPID" Apr 17 16:31:18.089256 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:18.089238 2572 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-131-177.ec2.internal\" not found" node="ip-10-0-131-177.ec2.internal" Apr 17 16:31:18.093084 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:18.093062 2572 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-131-177.ec2.internal\" not found" node="ip-10-0-131-177.ec2.internal" Apr 17 16:31:18.102319 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:18.102304 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-177.ec2.internal\" not found" Apr 17 16:31:18.112194 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:18.112174 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/7e3c596a27faede1f97b6bb0972592f6-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-131-177.ec2.internal\" (UID: \"7e3c596a27faede1f97b6bb0972592f6\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-177.ec2.internal"
Apr 17 16:31:18.112262 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:18.112200 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7e3c596a27faede1f97b6bb0972592f6-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-177.ec2.internal\" (UID: \"7e3c596a27faede1f97b6bb0972592f6\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-177.ec2.internal"
Apr 17 16:31:18.112262 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:18.112222 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/50b2c37a28961f1c8aacb6ad5db58d22-config\") pod \"kube-apiserver-proxy-ip-10-0-131-177.ec2.internal\" (UID: \"50b2c37a28961f1c8aacb6ad5db58d22\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-177.ec2.internal"
Apr 17 16:31:18.203028 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:18.203006 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-177.ec2.internal\" not found"
Apr 17 16:31:18.213080 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:18.213058 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7e3c596a27faede1f97b6bb0972592f6-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-177.ec2.internal\" (UID: \"7e3c596a27faede1f97b6bb0972592f6\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-177.ec2.internal"
Apr 17 16:31:18.213139 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:18.213081 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/50b2c37a28961f1c8aacb6ad5db58d22-config\") pod \"kube-apiserver-proxy-ip-10-0-131-177.ec2.internal\" (UID: \"50b2c37a28961f1c8aacb6ad5db58d22\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-177.ec2.internal"
Apr 17 16:31:18.213139 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:18.213098 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/7e3c596a27faede1f97b6bb0972592f6-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-177.ec2.internal\" (UID: \"7e3c596a27faede1f97b6bb0972592f6\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-177.ec2.internal"
Apr 17 16:31:18.213139 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:18.213106 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7e3c596a27faede1f97b6bb0972592f6-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-177.ec2.internal\" (UID: \"7e3c596a27faede1f97b6bb0972592f6\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-177.ec2.internal"
Apr 17 16:31:18.213139 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:18.213119 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/7e3c596a27faede1f97b6bb0972592f6-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-177.ec2.internal\" (UID: \"7e3c596a27faede1f97b6bb0972592f6\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-177.ec2.internal"
Apr 17 16:31:18.213274 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:18.213152 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/50b2c37a28961f1c8aacb6ad5db58d22-config\") pod \"kube-apiserver-proxy-ip-10-0-131-177.ec2.internal\" (UID: \"50b2c37a28961f1c8aacb6ad5db58d22\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-177.ec2.internal"
Apr 17 16:31:18.303549 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:18.303491 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-177.ec2.internal\" not found"
Apr 17 16:31:18.390966 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:18.390936 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-177.ec2.internal"
Apr 17 16:31:18.395432 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:18.395392 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-177.ec2.internal"
Apr 17 16:31:18.403974 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:18.403957 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-177.ec2.internal\" not found"
Apr 17 16:31:18.504561 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:18.504528 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-177.ec2.internal\" not found"
Apr 17 16:31:18.605086 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:18.605022 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-177.ec2.internal\" not found"
Apr 17 16:31:18.705496 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:18.705475 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-177.ec2.internal\" not found"
Apr 17 16:31:18.727927 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:18.727909 2572 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 17 16:31:18.728412 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:18.728037 2572 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 17 16:31:18.806441 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:18.806410 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-177.ec2.internal\" not found"
Apr 17 16:31:18.810878 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:18.810863 2572 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 17 16:31:18.823529 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:18.823488 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-16 16:26:17 +0000 UTC" deadline="2027-11-24 04:13:04.298844422 +0000 UTC"
Apr 17 16:31:18.823529 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:18.823522 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14051h41m45.475326211s"
Apr 17 16:31:18.825406 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:18.825392 2572 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 16:31:18.828064 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:18.828046 2572 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 17 16:31:18.851698 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:18.851673 2572 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-cjrwd"
Apr 17 16:31:18.859433 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:18.859386 2572 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-cjrwd"
Apr 17 16:31:18.906509 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:18.906468 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-177.ec2.internal\" not found"
Apr 17 16:31:18.998275 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:18.998243 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e3c596a27faede1f97b6bb0972592f6.slice/crio-ca3de6d38b93b6c399871987b2853f5573e3834d414e34567593e4bec6b3fc86 WatchSource:0}: Error finding container ca3de6d38b93b6c399871987b2853f5573e3834d414e34567593e4bec6b3fc86: Status 404 returned error can't find the container with id ca3de6d38b93b6c399871987b2853f5573e3834d414e34567593e4bec6b3fc86
Apr 17 16:31:18.998646 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:18.998627 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod50b2c37a28961f1c8aacb6ad5db58d22.slice/crio-1f86ed160e0181393b131a1660b378c043fa4c43b768c748a06d578c2d18b76e WatchSource:0}: Error finding container 1f86ed160e0181393b131a1660b378c043fa4c43b768c748a06d578c2d18b76e: Status 404 returned error can't find the container with id 1f86ed160e0181393b131a1660b378c043fa4c43b768c748a06d578c2d18b76e
Apr 17 16:31:19.004234 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.004216 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 16:31:19.007535 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:19.007517 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-177.ec2.internal\" not found"
Apr 17 16:31:19.085187 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.085162 2572 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 16:31:19.107877 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:19.107858 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-177.ec2.internal\" not found"
Apr 17 16:31:19.208374 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:19.208316 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-177.ec2.internal\" not found"
Apr 17 16:31:19.308997 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:19.308963 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-177.ec2.internal\" not found"
Apr 17 16:31:19.333578 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.333557 2572 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 16:31:19.411358 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.411329 2572 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-177.ec2.internal"
Apr 17 16:31:19.425673 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.425647 2572 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 17 16:31:19.426471 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.426453 2572 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-177.ec2.internal"
Apr 17 16:31:19.438810 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.438787 2572 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 17 16:31:19.794162 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.793941 2572 apiserver.go:52] "Watching apiserver"
Apr 17 16:31:19.800581 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.800555 2572 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 17 16:31:19.801699 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.801662 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/konnectivity-agent-clpwx","openshift-cluster-node-tuning-operator/tuned-9nxzl","openshift-dns/node-resolver-5q2mv","openshift-image-registry/node-ca-gmh4h","openshift-multus/multus-additional-cni-plugins-mnjjm","openshift-multus/multus-kxp4l","openshift-ovn-kubernetes/ovnkube-node-vj4bz","kube-system/kube-apiserver-proxy-ip-10-0-131-177.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lxg4x","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-177.ec2.internal","openshift-multus/network-metrics-daemon-vw79z","openshift-network-diagnostics/network-check-target-xrrkr","openshift-network-operator/iptables-alerter-qg47l"]
Apr 17 16:31:19.804641 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.804618 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-mnjjm"
Apr 17 16:31:19.805628 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.805607 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-kxp4l"
Apr 17 16:31:19.805793 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.805772 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vj4bz"
Apr 17 16:31:19.807001 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.806979 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 17 16:31:19.807112 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.807088 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-gmh4h"
Apr 17 16:31:19.807518 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.807199 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 17 16:31:19.807518 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.807297 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-zwp8b\""
Apr 17 16:31:19.807518 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.807310 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 17 16:31:19.807518 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.807364 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 17 16:31:19.807518 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.807432 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 17 16:31:19.808271 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.807909 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-d5k84\""
Apr 17 16:31:19.808271 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.807946 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 17 16:31:19.808271 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.807950 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-whnrn\""
Apr 17 16:31:19.808271 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.808147 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 17 16:31:19.808652 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.808629 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 17 16:31:19.809080 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.809060 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 17 16:31:19.809080 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.809072 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 17 16:31:19.809330 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.809315 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 17 16:31:19.809502 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.809489 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 17 16:31:19.810909 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.810055 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 17 16:31:19.810909 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.810238 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-5q2mv"
Apr 17 16:31:19.810909 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.810360 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 17 16:31:19.810909 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.810634 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 17 16:31:19.810909 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.810846 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-llzzq\""
Apr 17 16:31:19.811350 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.811332 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-clpwx"
Apr 17 16:31:19.812388 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.812201 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 17 16:31:19.812736 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.812702 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-8lg5q\""
Apr 17 16:31:19.812977 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.812867 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lxg4x"
Apr 17 16:31:19.812977 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.812951 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 17 16:31:19.813862 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.813843 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 17 16:31:19.814148 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.814057 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 17 16:31:19.814363 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.814341 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-7kkll\""
Apr 17 16:31:19.814812 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.814795 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-lv9n7\""
Apr 17 16:31:19.814998 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.814980 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vw79z"
Apr 17 16:31:19.816440 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:19.815291 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vw79z" podUID="36f3412d-e266-4f24-8ea6-1f3d3cdd2546"
Apr 17 16:31:19.816440 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.815489 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 17 16:31:19.816440 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.815568 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 17 16:31:19.816440 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.815491 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 17 16:31:19.819053 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.819034 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-9nxzl"
Apr 17 16:31:19.819407 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.819391 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xrrkr"
Apr 17 16:31:19.819500 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:19.819455 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xrrkr" podUID="e90a9ccd-0623-4495-a588-60c702965b82"
Apr 17 16:31:19.821009 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.820993 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 17 16:31:19.821316 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.821293 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-qg47l"
Apr 17 16:31:19.821425 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.821370 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 17 16:31:19.821661 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.821643 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/428c05e5-0a0b-4a8c-8239-83701e670fe3-etc-kubernetes\") pod \"tuned-9nxzl\" (UID: \"428c05e5-0a0b-4a8c-8239-83701e670fe3\") " pod="openshift-cluster-node-tuning-operator/tuned-9nxzl"
Apr 17 16:31:19.821766 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.821667 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2685d399-ce45-4aec-bf5c-ce5d17cb16f4-host\") pod \"node-ca-gmh4h\" (UID: \"2685d399-ce45-4aec-bf5c-ce5d17cb16f4\") " pod="openshift-image-registry/node-ca-gmh4h"
Apr 17 16:31:19.821766 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.821683 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/428c05e5-0a0b-4a8c-8239-83701e670fe3-etc-systemd\") pod \"tuned-9nxzl\" (UID: \"428c05e5-0a0b-4a8c-8239-83701e670fe3\") " pod="openshift-cluster-node-tuning-operator/tuned-9nxzl"
Apr 17 16:31:19.821766 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.821699 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f0075f81-88ff-4518-bd4e-bc50656a593b-host-run-netns\") pod \"ovnkube-node-vj4bz\" (UID: \"f0075f81-88ff-4518-bd4e-bc50656a593b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vj4bz"
Apr 17 16:31:19.821766 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.821742 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f0075f81-88ff-4518-bd4e-bc50656a593b-env-overrides\") pod \"ovnkube-node-vj4bz\" (UID: \"f0075f81-88ff-4518-bd4e-bc50656a593b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vj4bz"
Apr 17 16:31:19.821766 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.821759 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f0075f81-88ff-4518-bd4e-bc50656a593b-log-socket\") pod \"ovnkube-node-vj4bz\" (UID: \"f0075f81-88ff-4518-bd4e-bc50656a593b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vj4bz"
Apr 17 16:31:19.821948 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.821772 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ff70e5e0-9bde-4a87-af27-6726427e4ba4-host-var-lib-kubelet\") pod \"multus-kxp4l\" (UID: \"ff70e5e0-9bde-4a87-af27-6726427e4ba4\") " pod="openshift-multus/multus-kxp4l"
Apr 17 16:31:19.821948 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.821786 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ff70e5e0-9bde-4a87-af27-6726427e4ba4-multus-cni-dir\") pod \"multus-kxp4l\" (UID: \"ff70e5e0-9bde-4a87-af27-6726427e4ba4\") " pod="openshift-multus/multus-kxp4l"
Apr 17 16:31:19.821948 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.821800 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ff70e5e0-9bde-4a87-af27-6726427e4ba4-hostroot\") pod \"multus-kxp4l\" (UID: \"ff70e5e0-9bde-4a87-af27-6726427e4ba4\") " pod="openshift-multus/multus-kxp4l"
Apr 17 16:31:19.821948 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.821814 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ff70e5e0-9bde-4a87-af27-6726427e4ba4-multus-conf-dir\") pod \"multus-kxp4l\" (UID: \"ff70e5e0-9bde-4a87-af27-6726427e4ba4\") " pod="openshift-multus/multus-kxp4l"
Apr 17 16:31:19.821948 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.821836 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ff70e5e0-9bde-4a87-af27-6726427e4ba4-multus-daemon-config\") pod \"multus-kxp4l\" (UID: \"ff70e5e0-9bde-4a87-af27-6726427e4ba4\") " pod="openshift-multus/multus-kxp4l"
Apr 17 16:31:19.821948 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.821849 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/428c05e5-0a0b-4a8c-8239-83701e670fe3-etc-tuned\") pod \"tuned-9nxzl\" (UID: \"428c05e5-0a0b-4a8c-8239-83701e670fe3\") " pod="openshift-cluster-node-tuning-operator/tuned-9nxzl"
Apr 17 16:31:19.821948 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.821862 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ff70e5e0-9bde-4a87-af27-6726427e4ba4-host-var-lib-cni-multus\") pod \"multus-kxp4l\" (UID: \"ff70e5e0-9bde-4a87-af27-6726427e4ba4\") " pod="openshift-multus/multus-kxp4l"
Apr 17 16:31:19.821948 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.821885 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/759006e8-ec0f-48e4-bf68-b87d7dcbf08e-system-cni-dir\") pod \"multus-additional-cni-plugins-mnjjm\" (UID: \"759006e8-ec0f-48e4-bf68-b87d7dcbf08e\") " pod="openshift-multus/multus-additional-cni-plugins-mnjjm"
Apr 17 16:31:19.821948 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.821898 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f0075f81-88ff-4518-bd4e-bc50656a593b-host-slash\") pod \"ovnkube-node-vj4bz\" (UID: \"f0075f81-88ff-4518-bd4e-bc50656a593b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vj4bz"
Apr 17 16:31:19.821948 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.821912 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f0075f81-88ff-4518-bd4e-bc50656a593b-etc-openvswitch\") pod \"ovnkube-node-vj4bz\" (UID: \"f0075f81-88ff-4518-bd4e-bc50656a593b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vj4bz"
Apr 17 16:31:19.821948 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.821933 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f0075f81-88ff-4518-bd4e-bc50656a593b-host-run-ovn-kubernetes\") pod \"ovnkube-node-vj4bz\" (UID: \"f0075f81-88ff-4518-bd4e-bc50656a593b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vj4bz"
Apr 17 16:31:19.822317 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.821956 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ff70e5e0-9bde-4a87-af27-6726427e4ba4-cnibin\") pod \"multus-kxp4l\" (UID: \"ff70e5e0-9bde-4a87-af27-6726427e4ba4\") " pod="openshift-multus/multus-kxp4l"
Apr 17 16:31:19.822317 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.821971 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ff70e5e0-9bde-4a87-af27-6726427e4ba4-host-run-netns\") pod \"multus-kxp4l\" (UID: \"ff70e5e0-9bde-4a87-af27-6726427e4ba4\") " pod="openshift-multus/multus-kxp4l"
Apr 17 16:31:19.822317 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.821984 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/428c05e5-0a0b-4a8c-8239-83701e670fe3-var-lib-kubelet\") pod \"tuned-9nxzl\" (UID: \"428c05e5-0a0b-4a8c-8239-83701e670fe3\") " pod="openshift-cluster-node-tuning-operator/tuned-9nxzl"
Apr 17 16:31:19.822317 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.821997 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f0075f81-88ff-4518-bd4e-bc50656a593b-node-log\") pod \"ovnkube-node-vj4bz\" (UID: \"f0075f81-88ff-4518-bd4e-bc50656a593b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vj4bz"
Apr 17 16:31:19.822317 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.822010 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ff70e5e0-9bde-4a87-af27-6726427e4ba4-cni-binary-copy\") pod \"multus-kxp4l\" (UID: \"ff70e5e0-9bde-4a87-af27-6726427e4ba4\") " pod="openshift-multus/multus-kxp4l"
Apr 17 16:31:19.822317 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.822024 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ff70e5e0-9bde-4a87-af27-6726427e4ba4-host-run-multus-certs\") pod \"multus-kxp4l\" (UID: \"ff70e5e0-9bde-4a87-af27-6726427e4ba4\") " pod="openshift-multus/multus-kxp4l"
Apr 17 16:31:19.822317 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.822038 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/428c05e5-0a0b-4a8c-8239-83701e670fe3-etc-sysconfig\") pod \"tuned-9nxzl\" (UID: \"428c05e5-0a0b-4a8c-8239-83701e670fe3\") " pod="openshift-cluster-node-tuning-operator/tuned-9nxzl"
Apr 17 16:31:19.822317 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.822047 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-6pmsf\""
Apr 17 16:31:19.822317 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.822056 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/428c05e5-0a0b-4a8c-8239-83701e670fe3-etc-sysctl-conf\") pod \"tuned-9nxzl\" (UID: \"428c05e5-0a0b-4a8c-8239-83701e670fe3\") " pod="openshift-cluster-node-tuning-operator/tuned-9nxzl"
Apr 17 16:31:19.822317 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.822075 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tllmk\" (UniqueName: \"kubernetes.io/projected/f0075f81-88ff-4518-bd4e-bc50656a593b-kube-api-access-tllmk\") pod \"ovnkube-node-vj4bz\" (UID: \"f0075f81-88ff-4518-bd4e-bc50656a593b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vj4bz"
Apr 17 16:31:19.822317 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.822107 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ff70e5e0-9bde-4a87-af27-6726427e4ba4-host-run-k8s-cni-cncf-io\") pod \"multus-kxp4l\" (UID: \"ff70e5e0-9bde-4a87-af27-6726427e4ba4\") " pod="openshift-multus/multus-kxp4l"
Apr 17 16:31:19.822317 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.822129 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/428c05e5-0a0b-4a8c-8239-83701e670fe3-host\") pod \"tuned-9nxzl\" (UID: \"428c05e5-0a0b-4a8c-8239-83701e670fe3\") " pod="openshift-cluster-node-tuning-operator/tuned-9nxzl"
Apr 17 16:31:19.822317 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.822156 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f0075f81-88ff-4518-bd4e-bc50656a593b-run-ovn\") pod \"ovnkube-node-vj4bz\" (UID: \"f0075f81-88ff-4518-bd4e-bc50656a593b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vj4bz"
Apr 17 16:31:19.822317 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.822182 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f0075f81-88ff-4518-bd4e-bc50656a593b-ovn-node-metrics-cert\") pod \"ovnkube-node-vj4bz\" (UID: \"f0075f81-88ff-4518-bd4e-bc50656a593b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vj4bz"
Apr 17 16:31:19.822317 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.822208 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/e7e38ded-7b99-4b86-9ba3-f0cdd8e37344-konnectivity-ca\") pod \"konnectivity-agent-clpwx\" (UID: \"e7e38ded-7b99-4b86-9ba3-f0cdd8e37344\") " pod="kube-system/konnectivity-agent-clpwx"
Apr 17 16:31:19.822317 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.822230 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/759006e8-ec0f-48e4-bf68-b87d7dcbf08e-cni-binary-copy\") pod \"multus-additional-cni-plugins-mnjjm\" (UID: \"759006e8-ec0f-48e4-bf68-b87d7dcbf08e\") " pod="openshift-multus/multus-additional-cni-plugins-mnjjm"
Apr 17 16:31:19.822317 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.822269 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ff70e5e0-9bde-4a87-af27-6726427e4ba4-os-release\") pod \"multus-kxp4l\" (UID: \"ff70e5e0-9bde-4a87-af27-6726427e4ba4\") " pod="openshift-multus/multus-kxp4l"
Apr 17 16:31:19.822934 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.822300 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ff70e5e0-9bde-4a87-af27-6726427e4ba4-host-var-lib-cni-bin\") pod \"multus-kxp4l\" (UID: \"ff70e5e0-9bde-4a87-af27-6726427e4ba4\") " pod="openshift-multus/multus-kxp4l"
Apr 17 16:31:19.822934 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.822407 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/428c05e5-0a0b-4a8c-8239-83701e670fe3-etc-modprobe-d\") pod \"tuned-9nxzl\" (UID: \"428c05e5-0a0b-4a8c-8239-83701e670fe3\") " pod="openshift-cluster-node-tuning-operator/tuned-9nxzl"
Apr 17 16:31:19.822934 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.822443 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/428c05e5-0a0b-4a8c-8239-83701e670fe3-lib-modules\") pod \"tuned-9nxzl\" (UID: \"428c05e5-0a0b-4a8c-8239-83701e670fe3\") " pod="openshift-cluster-node-tuning-operator/tuned-9nxzl"
Apr 17 16:31:19.822934 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.822469 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f0075f81-88ff-4518-bd4e-bc50656a593b-ovnkube-config\") pod \"ovnkube-node-vj4bz\" (UID: \"f0075f81-88ff-4518-bd4e-bc50656a593b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vj4bz"
Apr 17 16:31:19.822934 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.822508 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/428c05e5-0a0b-4a8c-8239-83701e670fe3-run\") pod \"tuned-9nxzl\" (UID: \"428c05e5-0a0b-4a8c-8239-83701e670fe3\") " pod="openshift-cluster-node-tuning-operator/tuned-9nxzl"
Apr 17 16:31:19.822934 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.822546 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/759006e8-ec0f-48e4-bf68-b87d7dcbf08e-cnibin\") pod \"multus-additional-cni-plugins-mnjjm\" (UID: \"759006e8-ec0f-48e4-bf68-b87d7dcbf08e\") " pod="openshift-multus/multus-additional-cni-plugins-mnjjm"
Apr 17 16:31:19.822934 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.822563 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzbr4\" (UniqueName: \"kubernetes.io/projected/759006e8-ec0f-48e4-bf68-b87d7dcbf08e-kube-api-access-vzbr4\") pod \"multus-additional-cni-plugins-mnjjm\" (UID: \"759006e8-ec0f-48e4-bf68-b87d7dcbf08e\") " pod="openshift-multus/multus-additional-cni-plugins-mnjjm"
Apr 17 16:31:19.822934 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.822601 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2685d399-ce45-4aec-bf5c-ce5d17cb16f4-serviceca\") pod \"node-ca-gmh4h\" (UID: \"2685d399-ce45-4aec-bf5c-ce5d17cb16f4\") " pod="openshift-image-registry/node-ca-gmh4h"
Apr 17 16:31:19.822934 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.822628 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8528bd48-4f37-4fef-bd6e-7df9d6a2773f-hosts-file\") pod \"node-resolver-5q2mv\" (UID: \"8528bd48-4f37-4fef-bd6e-7df9d6a2773f\") " pod="openshift-dns/node-resolver-5q2mv"
Apr 17 16:31:19.822934 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.822677 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8528bd48-4f37-4fef-bd6e-7df9d6a2773f-tmp-dir\") pod \"node-resolver-5q2mv\" (UID: \"8528bd48-4f37-4fef-bd6e-7df9d6a2773f\") " pod="openshift-dns/node-resolver-5q2mv"
Apr 17 16:31:19.822934 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.822754 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22qtd\" (UniqueName: \"kubernetes.io/projected/8528bd48-4f37-4fef-bd6e-7df9d6a2773f-kube-api-access-22qtd\") pod \"node-resolver-5q2mv\" (UID: \"8528bd48-4f37-4fef-bd6e-7df9d6a2773f\") " pod="openshift-dns/node-resolver-5q2mv"
Apr 17 16:31:19.822934 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.822810 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mlm4\" (UniqueName: \"kubernetes.io/projected/428c05e5-0a0b-4a8c-8239-83701e670fe3-kube-api-access-7mlm4\") pod \"tuned-9nxzl\" (UID: \"428c05e5-0a0b-4a8c-8239-83701e670fe3\") " 
pod="openshift-cluster-node-tuning-operator/tuned-9nxzl" Apr 17 16:31:19.822934 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.822868 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/759006e8-ec0f-48e4-bf68-b87d7dcbf08e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-mnjjm\" (UID: \"759006e8-ec0f-48e4-bf68-b87d7dcbf08e\") " pod="openshift-multus/multus-additional-cni-plugins-mnjjm" Apr 17 16:31:19.822934 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.822908 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f0075f81-88ff-4518-bd4e-bc50656a593b-host-kubelet\") pod \"ovnkube-node-vj4bz\" (UID: \"f0075f81-88ff-4518-bd4e-bc50656a593b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vj4bz" Apr 17 16:31:19.823559 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.822947 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f0075f81-88ff-4518-bd4e-bc50656a593b-systemd-units\") pod \"ovnkube-node-vj4bz\" (UID: \"f0075f81-88ff-4518-bd4e-bc50656a593b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vj4bz" Apr 17 16:31:19.823559 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.822972 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f0075f81-88ff-4518-bd4e-bc50656a593b-run-systemd\") pod \"ovnkube-node-vj4bz\" (UID: \"f0075f81-88ff-4518-bd4e-bc50656a593b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vj4bz" Apr 17 16:31:19.823559 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.823002 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/ff70e5e0-9bde-4a87-af27-6726427e4ba4-multus-socket-dir-parent\") pod \"multus-kxp4l\" (UID: \"ff70e5e0-9bde-4a87-af27-6726427e4ba4\") " pod="openshift-multus/multus-kxp4l" Apr 17 16:31:19.823559 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.823042 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f0075f81-88ff-4518-bd4e-bc50656a593b-var-lib-openvswitch\") pod \"ovnkube-node-vj4bz\" (UID: \"f0075f81-88ff-4518-bd4e-bc50656a593b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vj4bz" Apr 17 16:31:19.823559 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.823068 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f0075f81-88ff-4518-bd4e-bc50656a593b-run-openvswitch\") pod \"ovnkube-node-vj4bz\" (UID: \"f0075f81-88ff-4518-bd4e-bc50656a593b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vj4bz" Apr 17 16:31:19.823559 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.823093 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f0075f81-88ff-4518-bd4e-bc50656a593b-host-cni-netd\") pod \"ovnkube-node-vj4bz\" (UID: \"f0075f81-88ff-4518-bd4e-bc50656a593b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vj4bz" Apr 17 16:31:19.823559 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.823120 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f0075f81-88ff-4518-bd4e-bc50656a593b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vj4bz\" (UID: \"f0075f81-88ff-4518-bd4e-bc50656a593b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vj4bz" Apr 17 16:31:19.823559 ip-10-0-131-177 
kubenswrapper[2572]: I0417 16:31:19.823148 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f0075f81-88ff-4518-bd4e-bc50656a593b-ovnkube-script-lib\") pod \"ovnkube-node-vj4bz\" (UID: \"f0075f81-88ff-4518-bd4e-bc50656a593b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vj4bz" Apr 17 16:31:19.823559 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.823181 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/428c05e5-0a0b-4a8c-8239-83701e670fe3-tmp\") pod \"tuned-9nxzl\" (UID: \"428c05e5-0a0b-4a8c-8239-83701e670fe3\") " pod="openshift-cluster-node-tuning-operator/tuned-9nxzl" Apr 17 16:31:19.823559 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.823218 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/759006e8-ec0f-48e4-bf68-b87d7dcbf08e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-mnjjm\" (UID: \"759006e8-ec0f-48e4-bf68-b87d7dcbf08e\") " pod="openshift-multus/multus-additional-cni-plugins-mnjjm" Apr 17 16:31:19.823559 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.823240 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f0075f81-88ff-4518-bd4e-bc50656a593b-host-cni-bin\") pod \"ovnkube-node-vj4bz\" (UID: \"f0075f81-88ff-4518-bd4e-bc50656a593b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vj4bz" Apr 17 16:31:19.823559 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.823258 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ff70e5e0-9bde-4a87-af27-6726427e4ba4-etc-kubernetes\") pod \"multus-kxp4l\" (UID: 
\"ff70e5e0-9bde-4a87-af27-6726427e4ba4\") " pod="openshift-multus/multus-kxp4l" Apr 17 16:31:19.823559 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.823277 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/428c05e5-0a0b-4a8c-8239-83701e670fe3-etc-sysctl-d\") pod \"tuned-9nxzl\" (UID: \"428c05e5-0a0b-4a8c-8239-83701e670fe3\") " pod="openshift-cluster-node-tuning-operator/tuned-9nxzl" Apr 17 16:31:19.823559 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.823309 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ff70e5e0-9bde-4a87-af27-6726427e4ba4-system-cni-dir\") pod \"multus-kxp4l\" (UID: \"ff70e5e0-9bde-4a87-af27-6726427e4ba4\") " pod="openshift-multus/multus-kxp4l" Apr 17 16:31:19.823559 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.823325 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/e7e38ded-7b99-4b86-9ba3-f0cdd8e37344-agent-certs\") pod \"konnectivity-agent-clpwx\" (UID: \"e7e38ded-7b99-4b86-9ba3-f0cdd8e37344\") " pod="kube-system/konnectivity-agent-clpwx" Apr 17 16:31:19.823559 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.823325 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-csnw6\"" Apr 17 16:31:19.823559 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.823345 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4kfsl\" (UniqueName: \"kubernetes.io/projected/ff70e5e0-9bde-4a87-af27-6726427e4ba4-kube-api-access-4kfsl\") pod \"multus-kxp4l\" (UID: \"ff70e5e0-9bde-4a87-af27-6726427e4ba4\") " pod="openshift-multus/multus-kxp4l" Apr 17 16:31:19.824241 ip-10-0-131-177 
kubenswrapper[2572]: I0417 16:31:19.823359 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/428c05e5-0a0b-4a8c-8239-83701e670fe3-sys\") pod \"tuned-9nxzl\" (UID: \"428c05e5-0a0b-4a8c-8239-83701e670fe3\") " pod="openshift-cluster-node-tuning-operator/tuned-9nxzl" Apr 17 16:31:19.824241 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.823377 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/759006e8-ec0f-48e4-bf68-b87d7dcbf08e-os-release\") pod \"multus-additional-cni-plugins-mnjjm\" (UID: \"759006e8-ec0f-48e4-bf68-b87d7dcbf08e\") " pod="openshift-multus/multus-additional-cni-plugins-mnjjm" Apr 17 16:31:19.824241 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.823396 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/759006e8-ec0f-48e4-bf68-b87d7dcbf08e-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-mnjjm\" (UID: \"759006e8-ec0f-48e4-bf68-b87d7dcbf08e\") " pod="openshift-multus/multus-additional-cni-plugins-mnjjm" Apr 17 16:31:19.824241 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.823425 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdt9w\" (UniqueName: \"kubernetes.io/projected/2685d399-ce45-4aec-bf5c-ce5d17cb16f4-kube-api-access-vdt9w\") pod \"node-ca-gmh4h\" (UID: \"2685d399-ce45-4aec-bf5c-ce5d17cb16f4\") " pod="openshift-image-registry/node-ca-gmh4h" Apr 17 16:31:19.824241 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.823528 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 17 16:31:19.824241 ip-10-0-131-177 kubenswrapper[2572]: I0417 
16:31:19.823700 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 17 16:31:19.824241 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.823862 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 17 16:31:19.843253 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.843237 2572 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 16:31:19.861447 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.861425 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 16:26:18 +0000 UTC" deadline="2027-12-04 11:17:20.475700727 +0000 UTC" Apr 17 16:31:19.861560 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.861447 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14298h46m0.614256897s" Apr 17 16:31:19.911516 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.911485 2572 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 17 16:31:19.923875 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.923855 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/428c05e5-0a0b-4a8c-8239-83701e670fe3-etc-systemd\") pod \"tuned-9nxzl\" (UID: \"428c05e5-0a0b-4a8c-8239-83701e670fe3\") " pod="openshift-cluster-node-tuning-operator/tuned-9nxzl" Apr 17 16:31:19.923997 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.923883 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f0075f81-88ff-4518-bd4e-bc50656a593b-host-run-netns\") pod \"ovnkube-node-vj4bz\" (UID: 
\"f0075f81-88ff-4518-bd4e-bc50656a593b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vj4bz" Apr 17 16:31:19.923997 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.923910 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f0075f81-88ff-4518-bd4e-bc50656a593b-env-overrides\") pod \"ovnkube-node-vj4bz\" (UID: \"f0075f81-88ff-4518-bd4e-bc50656a593b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vj4bz" Apr 17 16:31:19.923997 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.923930 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f269b2bc-3b99-4857-9296-e9d0485a3bde-socket-dir\") pod \"aws-ebs-csi-driver-node-lxg4x\" (UID: \"f269b2bc-3b99-4857-9296-e9d0485a3bde\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lxg4x" Apr 17 16:31:19.923997 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.923955 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f0075f81-88ff-4518-bd4e-bc50656a593b-log-socket\") pod \"ovnkube-node-vj4bz\" (UID: \"f0075f81-88ff-4518-bd4e-bc50656a593b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vj4bz" Apr 17 16:31:19.923997 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.923979 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/428c05e5-0a0b-4a8c-8239-83701e670fe3-etc-systemd\") pod \"tuned-9nxzl\" (UID: \"428c05e5-0a0b-4a8c-8239-83701e670fe3\") " pod="openshift-cluster-node-tuning-operator/tuned-9nxzl" Apr 17 16:31:19.923997 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.923991 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/ff70e5e0-9bde-4a87-af27-6726427e4ba4-host-var-lib-kubelet\") pod \"multus-kxp4l\" (UID: \"ff70e5e0-9bde-4a87-af27-6726427e4ba4\") " pod="openshift-multus/multus-kxp4l" Apr 17 16:31:19.924267 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.923979 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f0075f81-88ff-4518-bd4e-bc50656a593b-host-run-netns\") pod \"ovnkube-node-vj4bz\" (UID: \"f0075f81-88ff-4518-bd4e-bc50656a593b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vj4bz" Apr 17 16:31:19.924267 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.924006 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ff70e5e0-9bde-4a87-af27-6726427e4ba4-multus-cni-dir\") pod \"multus-kxp4l\" (UID: \"ff70e5e0-9bde-4a87-af27-6726427e4ba4\") " pod="openshift-multus/multus-kxp4l" Apr 17 16:31:19.924267 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.924015 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f0075f81-88ff-4518-bd4e-bc50656a593b-log-socket\") pod \"ovnkube-node-vj4bz\" (UID: \"f0075f81-88ff-4518-bd4e-bc50656a593b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vj4bz" Apr 17 16:31:19.924267 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.924041 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ff70e5e0-9bde-4a87-af27-6726427e4ba4-hostroot\") pod \"multus-kxp4l\" (UID: \"ff70e5e0-9bde-4a87-af27-6726427e4ba4\") " pod="openshift-multus/multus-kxp4l" Apr 17 16:31:19.924267 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.924060 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ff70e5e0-9bde-4a87-af27-6726427e4ba4-multus-cni-dir\") 
pod \"multus-kxp4l\" (UID: \"ff70e5e0-9bde-4a87-af27-6726427e4ba4\") " pod="openshift-multus/multus-kxp4l" Apr 17 16:31:19.924267 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.924069 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ff70e5e0-9bde-4a87-af27-6726427e4ba4-host-var-lib-kubelet\") pod \"multus-kxp4l\" (UID: \"ff70e5e0-9bde-4a87-af27-6726427e4ba4\") " pod="openshift-multus/multus-kxp4l" Apr 17 16:31:19.924267 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.924070 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ff70e5e0-9bde-4a87-af27-6726427e4ba4-multus-conf-dir\") pod \"multus-kxp4l\" (UID: \"ff70e5e0-9bde-4a87-af27-6726427e4ba4\") " pod="openshift-multus/multus-kxp4l" Apr 17 16:31:19.924267 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.924104 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ff70e5e0-9bde-4a87-af27-6726427e4ba4-multus-daemon-config\") pod \"multus-kxp4l\" (UID: \"ff70e5e0-9bde-4a87-af27-6726427e4ba4\") " pod="openshift-multus/multus-kxp4l" Apr 17 16:31:19.924267 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.924109 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ff70e5e0-9bde-4a87-af27-6726427e4ba4-hostroot\") pod \"multus-kxp4l\" (UID: \"ff70e5e0-9bde-4a87-af27-6726427e4ba4\") " pod="openshift-multus/multus-kxp4l" Apr 17 16:31:19.924267 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.924137 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/428c05e5-0a0b-4a8c-8239-83701e670fe3-etc-tuned\") pod \"tuned-9nxzl\" (UID: \"428c05e5-0a0b-4a8c-8239-83701e670fe3\") " 
pod="openshift-cluster-node-tuning-operator/tuned-9nxzl" Apr 17 16:31:19.924267 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.924110 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ff70e5e0-9bde-4a87-af27-6726427e4ba4-multus-conf-dir\") pod \"multus-kxp4l\" (UID: \"ff70e5e0-9bde-4a87-af27-6726427e4ba4\") " pod="openshift-multus/multus-kxp4l" Apr 17 16:31:19.924267 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.924172 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ff70e5e0-9bde-4a87-af27-6726427e4ba4-host-var-lib-cni-multus\") pod \"multus-kxp4l\" (UID: \"ff70e5e0-9bde-4a87-af27-6726427e4ba4\") " pod="openshift-multus/multus-kxp4l" Apr 17 16:31:19.924267 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.924223 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ff70e5e0-9bde-4a87-af27-6726427e4ba4-host-var-lib-cni-multus\") pod \"multus-kxp4l\" (UID: \"ff70e5e0-9bde-4a87-af27-6726427e4ba4\") " pod="openshift-multus/multus-kxp4l" Apr 17 16:31:19.924267 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.924267 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/759006e8-ec0f-48e4-bf68-b87d7dcbf08e-system-cni-dir\") pod \"multus-additional-cni-plugins-mnjjm\" (UID: \"759006e8-ec0f-48e4-bf68-b87d7dcbf08e\") " pod="openshift-multus/multus-additional-cni-plugins-mnjjm" Apr 17 16:31:19.924875 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.924317 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/759006e8-ec0f-48e4-bf68-b87d7dcbf08e-system-cni-dir\") pod \"multus-additional-cni-plugins-mnjjm\" (UID: 
\"759006e8-ec0f-48e4-bf68-b87d7dcbf08e\") " pod="openshift-multus/multus-additional-cni-plugins-mnjjm" Apr 17 16:31:19.924875 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.924374 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f0075f81-88ff-4518-bd4e-bc50656a593b-host-slash\") pod \"ovnkube-node-vj4bz\" (UID: \"f0075f81-88ff-4518-bd4e-bc50656a593b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vj4bz" Apr 17 16:31:19.924875 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.924380 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f0075f81-88ff-4518-bd4e-bc50656a593b-env-overrides\") pod \"ovnkube-node-vj4bz\" (UID: \"f0075f81-88ff-4518-bd4e-bc50656a593b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vj4bz" Apr 17 16:31:19.924875 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.924399 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f0075f81-88ff-4518-bd4e-bc50656a593b-etc-openvswitch\") pod \"ovnkube-node-vj4bz\" (UID: \"f0075f81-88ff-4518-bd4e-bc50656a593b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vj4bz" Apr 17 16:31:19.924875 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.924426 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f0075f81-88ff-4518-bd4e-bc50656a593b-host-run-ovn-kubernetes\") pod \"ovnkube-node-vj4bz\" (UID: \"f0075f81-88ff-4518-bd4e-bc50656a593b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vj4bz" Apr 17 16:31:19.924875 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.924449 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ff70e5e0-9bde-4a87-af27-6726427e4ba4-cnibin\") pod \"multus-kxp4l\" 
(UID: \"ff70e5e0-9bde-4a87-af27-6726427e4ba4\") " pod="openshift-multus/multus-kxp4l" Apr 17 16:31:19.924875 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.924464 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f0075f81-88ff-4518-bd4e-bc50656a593b-host-slash\") pod \"ovnkube-node-vj4bz\" (UID: \"f0075f81-88ff-4518-bd4e-bc50656a593b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vj4bz" Apr 17 16:31:19.924875 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.924472 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ff70e5e0-9bde-4a87-af27-6726427e4ba4-host-run-netns\") pod \"multus-kxp4l\" (UID: \"ff70e5e0-9bde-4a87-af27-6726427e4ba4\") " pod="openshift-multus/multus-kxp4l" Apr 17 16:31:19.924875 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.924496 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/428c05e5-0a0b-4a8c-8239-83701e670fe3-var-lib-kubelet\") pod \"tuned-9nxzl\" (UID: \"428c05e5-0a0b-4a8c-8239-83701e670fe3\") " pod="openshift-cluster-node-tuning-operator/tuned-9nxzl" Apr 17 16:31:19.924875 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.924505 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f0075f81-88ff-4518-bd4e-bc50656a593b-host-run-ovn-kubernetes\") pod \"ovnkube-node-vj4bz\" (UID: \"f0075f81-88ff-4518-bd4e-bc50656a593b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vj4bz" Apr 17 16:31:19.924875 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.924527 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/f269b2bc-3b99-4857-9296-e9d0485a3bde-sys-fs\") pod \"aws-ebs-csi-driver-node-lxg4x\" (UID: 
\"f269b2bc-3b99-4857-9296-e9d0485a3bde\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lxg4x" Apr 17 16:31:19.924875 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.924537 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f0075f81-88ff-4518-bd4e-bc50656a593b-etc-openvswitch\") pod \"ovnkube-node-vj4bz\" (UID: \"f0075f81-88ff-4518-bd4e-bc50656a593b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vj4bz" Apr 17 16:31:19.924875 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.924566 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qldm9\" (UniqueName: \"kubernetes.io/projected/36f3412d-e266-4f24-8ea6-1f3d3cdd2546-kube-api-access-qldm9\") pod \"network-metrics-daemon-vw79z\" (UID: \"36f3412d-e266-4f24-8ea6-1f3d3cdd2546\") " pod="openshift-multus/network-metrics-daemon-vw79z" Apr 17 16:31:19.924875 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.924603 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ff70e5e0-9bde-4a87-af27-6726427e4ba4-host-run-netns\") pod \"multus-kxp4l\" (UID: \"ff70e5e0-9bde-4a87-af27-6726427e4ba4\") " pod="openshift-multus/multus-kxp4l" Apr 17 16:31:19.924875 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.924609 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ff70e5e0-9bde-4a87-af27-6726427e4ba4-cnibin\") pod \"multus-kxp4l\" (UID: \"ff70e5e0-9bde-4a87-af27-6726427e4ba4\") " pod="openshift-multus/multus-kxp4l" Apr 17 16:31:19.924875 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.924625 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/428c05e5-0a0b-4a8c-8239-83701e670fe3-var-lib-kubelet\") pod \"tuned-9nxzl\" (UID: 
\"428c05e5-0a0b-4a8c-8239-83701e670fe3\") " pod="openshift-cluster-node-tuning-operator/tuned-9nxzl" Apr 17 16:31:19.924875 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.924627 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ff70e5e0-9bde-4a87-af27-6726427e4ba4-multus-daemon-config\") pod \"multus-kxp4l\" (UID: \"ff70e5e0-9bde-4a87-af27-6726427e4ba4\") " pod="openshift-multus/multus-kxp4l" Apr 17 16:31:19.924875 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.924622 2572 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 17 16:31:19.925673 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.924668 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f0075f81-88ff-4518-bd4e-bc50656a593b-node-log\") pod \"ovnkube-node-vj4bz\" (UID: \"f0075f81-88ff-4518-bd4e-bc50656a593b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vj4bz" Apr 17 16:31:19.925673 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.924630 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f0075f81-88ff-4518-bd4e-bc50656a593b-node-log\") pod \"ovnkube-node-vj4bz\" (UID: \"f0075f81-88ff-4518-bd4e-bc50656a593b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vj4bz" Apr 17 16:31:19.925673 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.924739 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ff70e5e0-9bde-4a87-af27-6726427e4ba4-cni-binary-copy\") pod \"multus-kxp4l\" (UID: \"ff70e5e0-9bde-4a87-af27-6726427e4ba4\") " pod="openshift-multus/multus-kxp4l" Apr 17 16:31:19.925673 ip-10-0-131-177 kubenswrapper[2572]: I0417 
16:31:19.924765 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ff70e5e0-9bde-4a87-af27-6726427e4ba4-host-run-multus-certs\") pod \"multus-kxp4l\" (UID: \"ff70e5e0-9bde-4a87-af27-6726427e4ba4\") " pod="openshift-multus/multus-kxp4l" Apr 17 16:31:19.925673 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.924789 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/428c05e5-0a0b-4a8c-8239-83701e670fe3-etc-sysconfig\") pod \"tuned-9nxzl\" (UID: \"428c05e5-0a0b-4a8c-8239-83701e670fe3\") " pod="openshift-cluster-node-tuning-operator/tuned-9nxzl" Apr 17 16:31:19.925673 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.924814 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/428c05e5-0a0b-4a8c-8239-83701e670fe3-etc-sysctl-conf\") pod \"tuned-9nxzl\" (UID: \"428c05e5-0a0b-4a8c-8239-83701e670fe3\") " pod="openshift-cluster-node-tuning-operator/tuned-9nxzl" Apr 17 16:31:19.925673 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.924836 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tllmk\" (UniqueName: \"kubernetes.io/projected/f0075f81-88ff-4518-bd4e-bc50656a593b-kube-api-access-tllmk\") pod \"ovnkube-node-vj4bz\" (UID: \"f0075f81-88ff-4518-bd4e-bc50656a593b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vj4bz" Apr 17 16:31:19.925673 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.924851 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/428c05e5-0a0b-4a8c-8239-83701e670fe3-etc-sysconfig\") pod \"tuned-9nxzl\" (UID: \"428c05e5-0a0b-4a8c-8239-83701e670fe3\") " pod="openshift-cluster-node-tuning-operator/tuned-9nxzl" Apr 17 16:31:19.925673 ip-10-0-131-177 
kubenswrapper[2572]: I0417 16:31:19.924863 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ff70e5e0-9bde-4a87-af27-6726427e4ba4-host-run-k8s-cni-cncf-io\") pod \"multus-kxp4l\" (UID: \"ff70e5e0-9bde-4a87-af27-6726427e4ba4\") " pod="openshift-multus/multus-kxp4l" Apr 17 16:31:19.925673 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.924887 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/428c05e5-0a0b-4a8c-8239-83701e670fe3-host\") pod \"tuned-9nxzl\" (UID: \"428c05e5-0a0b-4a8c-8239-83701e670fe3\") " pod="openshift-cluster-node-tuning-operator/tuned-9nxzl" Apr 17 16:31:19.925673 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.924911 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f0075f81-88ff-4518-bd4e-bc50656a593b-run-ovn\") pod \"ovnkube-node-vj4bz\" (UID: \"f0075f81-88ff-4518-bd4e-bc50656a593b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vj4bz" Apr 17 16:31:19.925673 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.924937 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f0075f81-88ff-4518-bd4e-bc50656a593b-ovn-node-metrics-cert\") pod \"ovnkube-node-vj4bz\" (UID: \"f0075f81-88ff-4518-bd4e-bc50656a593b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vj4bz" Apr 17 16:31:19.925673 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.924963 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/e7e38ded-7b99-4b86-9ba3-f0cdd8e37344-konnectivity-ca\") pod \"konnectivity-agent-clpwx\" (UID: \"e7e38ded-7b99-4b86-9ba3-f0cdd8e37344\") " pod="kube-system/konnectivity-agent-clpwx" Apr 17 16:31:19.925673 ip-10-0-131-177 
kubenswrapper[2572]: I0417 16:31:19.924979 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/428c05e5-0a0b-4a8c-8239-83701e670fe3-etc-sysctl-conf\") pod \"tuned-9nxzl\" (UID: \"428c05e5-0a0b-4a8c-8239-83701e670fe3\") " pod="openshift-cluster-node-tuning-operator/tuned-9nxzl" Apr 17 16:31:19.925673 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.924993 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/e0dcce32-f52f-45ec-b12d-45f981a5e5bf-iptables-alerter-script\") pod \"iptables-alerter-qg47l\" (UID: \"e0dcce32-f52f-45ec-b12d-45f981a5e5bf\") " pod="openshift-network-operator/iptables-alerter-qg47l" Apr 17 16:31:19.925673 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.925018 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e0dcce32-f52f-45ec-b12d-45f981a5e5bf-host-slash\") pod \"iptables-alerter-qg47l\" (UID: \"e0dcce32-f52f-45ec-b12d-45f981a5e5bf\") " pod="openshift-network-operator/iptables-alerter-qg47l" Apr 17 16:31:19.925673 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.925023 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f0075f81-88ff-4518-bd4e-bc50656a593b-run-ovn\") pod \"ovnkube-node-vj4bz\" (UID: \"f0075f81-88ff-4518-bd4e-bc50656a593b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vj4bz" Apr 17 16:31:19.926459 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.925043 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f269b2bc-3b99-4857-9296-e9d0485a3bde-kubelet-dir\") pod \"aws-ebs-csi-driver-node-lxg4x\" (UID: \"f269b2bc-3b99-4857-9296-e9d0485a3bde\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lxg4x" Apr 17 16:31:19.926459 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.925060 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ff70e5e0-9bde-4a87-af27-6726427e4ba4-host-run-k8s-cni-cncf-io\") pod \"multus-kxp4l\" (UID: \"ff70e5e0-9bde-4a87-af27-6726427e4ba4\") " pod="openshift-multus/multus-kxp4l" Apr 17 16:31:19.926459 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.925073 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/759006e8-ec0f-48e4-bf68-b87d7dcbf08e-cni-binary-copy\") pod \"multus-additional-cni-plugins-mnjjm\" (UID: \"759006e8-ec0f-48e4-bf68-b87d7dcbf08e\") " pod="openshift-multus/multus-additional-cni-plugins-mnjjm" Apr 17 16:31:19.926459 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.925101 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/428c05e5-0a0b-4a8c-8239-83701e670fe3-host\") pod \"tuned-9nxzl\" (UID: \"428c05e5-0a0b-4a8c-8239-83701e670fe3\") " pod="openshift-cluster-node-tuning-operator/tuned-9nxzl" Apr 17 16:31:19.926459 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.925102 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ff70e5e0-9bde-4a87-af27-6726427e4ba4-os-release\") pod \"multus-kxp4l\" (UID: \"ff70e5e0-9bde-4a87-af27-6726427e4ba4\") " pod="openshift-multus/multus-kxp4l" Apr 17 16:31:19.926459 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.925134 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ff70e5e0-9bde-4a87-af27-6726427e4ba4-host-var-lib-cni-bin\") pod \"multus-kxp4l\" (UID: \"ff70e5e0-9bde-4a87-af27-6726427e4ba4\") 
" pod="openshift-multus/multus-kxp4l" Apr 17 16:31:19.926459 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.925157 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/428c05e5-0a0b-4a8c-8239-83701e670fe3-etc-modprobe-d\") pod \"tuned-9nxzl\" (UID: \"428c05e5-0a0b-4a8c-8239-83701e670fe3\") " pod="openshift-cluster-node-tuning-operator/tuned-9nxzl" Apr 17 16:31:19.926459 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.925162 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ff70e5e0-9bde-4a87-af27-6726427e4ba4-os-release\") pod \"multus-kxp4l\" (UID: \"ff70e5e0-9bde-4a87-af27-6726427e4ba4\") " pod="openshift-multus/multus-kxp4l" Apr 17 16:31:19.926459 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.925178 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/428c05e5-0a0b-4a8c-8239-83701e670fe3-lib-modules\") pod \"tuned-9nxzl\" (UID: \"428c05e5-0a0b-4a8c-8239-83701e670fe3\") " pod="openshift-cluster-node-tuning-operator/tuned-9nxzl" Apr 17 16:31:19.926459 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.925199 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f0075f81-88ff-4518-bd4e-bc50656a593b-ovnkube-config\") pod \"ovnkube-node-vj4bz\" (UID: \"f0075f81-88ff-4518-bd4e-bc50656a593b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vj4bz" Apr 17 16:31:19.926459 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.925225 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/f269b2bc-3b99-4857-9296-e9d0485a3bde-device-dir\") pod \"aws-ebs-csi-driver-node-lxg4x\" (UID: \"f269b2bc-3b99-4857-9296-e9d0485a3bde\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lxg4x" Apr 17 16:31:19.926459 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.925246 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/428c05e5-0a0b-4a8c-8239-83701e670fe3-run\") pod \"tuned-9nxzl\" (UID: \"428c05e5-0a0b-4a8c-8239-83701e670fe3\") " pod="openshift-cluster-node-tuning-operator/tuned-9nxzl" Apr 17 16:31:19.926459 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.925267 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/759006e8-ec0f-48e4-bf68-b87d7dcbf08e-cnibin\") pod \"multus-additional-cni-plugins-mnjjm\" (UID: \"759006e8-ec0f-48e4-bf68-b87d7dcbf08e\") " pod="openshift-multus/multus-additional-cni-plugins-mnjjm" Apr 17 16:31:19.926459 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.925289 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vzbr4\" (UniqueName: \"kubernetes.io/projected/759006e8-ec0f-48e4-bf68-b87d7dcbf08e-kube-api-access-vzbr4\") pod \"multus-additional-cni-plugins-mnjjm\" (UID: \"759006e8-ec0f-48e4-bf68-b87d7dcbf08e\") " pod="openshift-multus/multus-additional-cni-plugins-mnjjm" Apr 17 16:31:19.926459 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.925314 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2685d399-ce45-4aec-bf5c-ce5d17cb16f4-serviceca\") pod \"node-ca-gmh4h\" (UID: \"2685d399-ce45-4aec-bf5c-ce5d17cb16f4\") " pod="openshift-image-registry/node-ca-gmh4h" Apr 17 16:31:19.926459 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.925336 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kd2z\" (UniqueName: \"kubernetes.io/projected/e0dcce32-f52f-45ec-b12d-45f981a5e5bf-kube-api-access-2kd2z\") pod 
\"iptables-alerter-qg47l\" (UID: \"e0dcce32-f52f-45ec-b12d-45f981a5e5bf\") " pod="openshift-network-operator/iptables-alerter-qg47l" Apr 17 16:31:19.926459 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.925357 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8m6s\" (UniqueName: \"kubernetes.io/projected/e90a9ccd-0623-4495-a588-60c702965b82-kube-api-access-b8m6s\") pod \"network-check-target-xrrkr\" (UID: \"e90a9ccd-0623-4495-a588-60c702965b82\") " pod="openshift-network-diagnostics/network-check-target-xrrkr" Apr 17 16:31:19.927202 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.925378 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8528bd48-4f37-4fef-bd6e-7df9d6a2773f-hosts-file\") pod \"node-resolver-5q2mv\" (UID: \"8528bd48-4f37-4fef-bd6e-7df9d6a2773f\") " pod="openshift-dns/node-resolver-5q2mv" Apr 17 16:31:19.927202 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.925399 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8528bd48-4f37-4fef-bd6e-7df9d6a2773f-tmp-dir\") pod \"node-resolver-5q2mv\" (UID: \"8528bd48-4f37-4fef-bd6e-7df9d6a2773f\") " pod="openshift-dns/node-resolver-5q2mv" Apr 17 16:31:19.927202 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.925419 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-22qtd\" (UniqueName: \"kubernetes.io/projected/8528bd48-4f37-4fef-bd6e-7df9d6a2773f-kube-api-access-22qtd\") pod \"node-resolver-5q2mv\" (UID: \"8528bd48-4f37-4fef-bd6e-7df9d6a2773f\") " pod="openshift-dns/node-resolver-5q2mv" Apr 17 16:31:19.927202 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.925442 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7mlm4\" (UniqueName: 
\"kubernetes.io/projected/428c05e5-0a0b-4a8c-8239-83701e670fe3-kube-api-access-7mlm4\") pod \"tuned-9nxzl\" (UID: \"428c05e5-0a0b-4a8c-8239-83701e670fe3\") " pod="openshift-cluster-node-tuning-operator/tuned-9nxzl" Apr 17 16:31:19.927202 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.925467 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/759006e8-ec0f-48e4-bf68-b87d7dcbf08e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-mnjjm\" (UID: \"759006e8-ec0f-48e4-bf68-b87d7dcbf08e\") " pod="openshift-multus/multus-additional-cni-plugins-mnjjm" Apr 17 16:31:19.927202 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.925491 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f0075f81-88ff-4518-bd4e-bc50656a593b-host-kubelet\") pod \"ovnkube-node-vj4bz\" (UID: \"f0075f81-88ff-4518-bd4e-bc50656a593b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vj4bz" Apr 17 16:31:19.927202 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.925516 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f0075f81-88ff-4518-bd4e-bc50656a593b-systemd-units\") pod \"ovnkube-node-vj4bz\" (UID: \"f0075f81-88ff-4518-bd4e-bc50656a593b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vj4bz" Apr 17 16:31:19.927202 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.925622 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f0075f81-88ff-4518-bd4e-bc50656a593b-run-systemd\") pod \"ovnkube-node-vj4bz\" (UID: \"f0075f81-88ff-4518-bd4e-bc50656a593b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vj4bz" Apr 17 16:31:19.927202 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.925682 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/e7e38ded-7b99-4b86-9ba3-f0cdd8e37344-konnectivity-ca\") pod \"konnectivity-agent-clpwx\" (UID: \"e7e38ded-7b99-4b86-9ba3-f0cdd8e37344\") " pod="kube-system/konnectivity-agent-clpwx" Apr 17 16:31:19.927202 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.925750 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ff70e5e0-9bde-4a87-af27-6726427e4ba4-multus-socket-dir-parent\") pod \"multus-kxp4l\" (UID: \"ff70e5e0-9bde-4a87-af27-6726427e4ba4\") " pod="openshift-multus/multus-kxp4l" Apr 17 16:31:19.927202 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.925776 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f0075f81-88ff-4518-bd4e-bc50656a593b-var-lib-openvswitch\") pod \"ovnkube-node-vj4bz\" (UID: \"f0075f81-88ff-4518-bd4e-bc50656a593b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vj4bz" Apr 17 16:31:19.927202 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.925789 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ff70e5e0-9bde-4a87-af27-6726427e4ba4-host-var-lib-cni-bin\") pod \"multus-kxp4l\" (UID: \"ff70e5e0-9bde-4a87-af27-6726427e4ba4\") " pod="openshift-multus/multus-kxp4l" Apr 17 16:31:19.927202 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.925798 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f0075f81-88ff-4518-bd4e-bc50656a593b-run-openvswitch\") pod \"ovnkube-node-vj4bz\" (UID: \"f0075f81-88ff-4518-bd4e-bc50656a593b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vj4bz" Apr 17 16:31:19.927202 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.925834 2572 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f0075f81-88ff-4518-bd4e-bc50656a593b-host-cni-netd\") pod \"ovnkube-node-vj4bz\" (UID: \"f0075f81-88ff-4518-bd4e-bc50656a593b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vj4bz" Apr 17 16:31:19.927202 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.925849 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/759006e8-ec0f-48e4-bf68-b87d7dcbf08e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-mnjjm\" (UID: \"759006e8-ec0f-48e4-bf68-b87d7dcbf08e\") " pod="openshift-multus/multus-additional-cni-plugins-mnjjm" Apr 17 16:31:19.927202 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.925867 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f0075f81-88ff-4518-bd4e-bc50656a593b-host-kubelet\") pod \"ovnkube-node-vj4bz\" (UID: \"f0075f81-88ff-4518-bd4e-bc50656a593b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vj4bz" Apr 17 16:31:19.927202 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.925903 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f0075f81-88ff-4518-bd4e-bc50656a593b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vj4bz\" (UID: \"f0075f81-88ff-4518-bd4e-bc50656a593b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vj4bz" Apr 17 16:31:19.927822 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.925923 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8528bd48-4f37-4fef-bd6e-7df9d6a2773f-tmp-dir\") pod \"node-resolver-5q2mv\" (UID: \"8528bd48-4f37-4fef-bd6e-7df9d6a2773f\") " pod="openshift-dns/node-resolver-5q2mv" Apr 17 16:31:19.927822 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.925953 2572 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/428c05e5-0a0b-4a8c-8239-83701e670fe3-etc-modprobe-d\") pod \"tuned-9nxzl\" (UID: \"428c05e5-0a0b-4a8c-8239-83701e670fe3\") " pod="openshift-cluster-node-tuning-operator/tuned-9nxzl" Apr 17 16:31:19.927822 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.926041 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8528bd48-4f37-4fef-bd6e-7df9d6a2773f-hosts-file\") pod \"node-resolver-5q2mv\" (UID: \"8528bd48-4f37-4fef-bd6e-7df9d6a2773f\") " pod="openshift-dns/node-resolver-5q2mv" Apr 17 16:31:19.927822 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.926059 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f0075f81-88ff-4518-bd4e-bc50656a593b-systemd-units\") pod \"ovnkube-node-vj4bz\" (UID: \"f0075f81-88ff-4518-bd4e-bc50656a593b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vj4bz" Apr 17 16:31:19.927822 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.926108 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f0075f81-88ff-4518-bd4e-bc50656a593b-run-systemd\") pod \"ovnkube-node-vj4bz\" (UID: \"f0075f81-88ff-4518-bd4e-bc50656a593b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vj4bz" Apr 17 16:31:19.927822 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.926136 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/428c05e5-0a0b-4a8c-8239-83701e670fe3-lib-modules\") pod \"tuned-9nxzl\" (UID: \"428c05e5-0a0b-4a8c-8239-83701e670fe3\") " pod="openshift-cluster-node-tuning-operator/tuned-9nxzl" Apr 17 16:31:19.927822 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.925654 2572 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/759006e8-ec0f-48e4-bf68-b87d7dcbf08e-cni-binary-copy\") pod \"multus-additional-cni-plugins-mnjjm\" (UID: \"759006e8-ec0f-48e4-bf68-b87d7dcbf08e\") " pod="openshift-multus/multus-additional-cni-plugins-mnjjm" Apr 17 16:31:19.927822 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.926188 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ff70e5e0-9bde-4a87-af27-6726427e4ba4-multus-socket-dir-parent\") pod \"multus-kxp4l\" (UID: \"ff70e5e0-9bde-4a87-af27-6726427e4ba4\") " pod="openshift-multus/multus-kxp4l" Apr 17 16:31:19.927822 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.926208 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/428c05e5-0a0b-4a8c-8239-83701e670fe3-run\") pod \"tuned-9nxzl\" (UID: \"428c05e5-0a0b-4a8c-8239-83701e670fe3\") " pod="openshift-cluster-node-tuning-operator/tuned-9nxzl" Apr 17 16:31:19.927822 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.925225 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ff70e5e0-9bde-4a87-af27-6726427e4ba4-cni-binary-copy\") pod \"multus-kxp4l\" (UID: \"ff70e5e0-9bde-4a87-af27-6726427e4ba4\") " pod="openshift-multus/multus-kxp4l" Apr 17 16:31:19.927822 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.926235 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f0075f81-88ff-4518-bd4e-bc50656a593b-var-lib-openvswitch\") pod \"ovnkube-node-vj4bz\" (UID: \"f0075f81-88ff-4518-bd4e-bc50656a593b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vj4bz" Apr 17 16:31:19.927822 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.925865 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f0075f81-88ff-4518-bd4e-bc50656a593b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vj4bz\" (UID: \"f0075f81-88ff-4518-bd4e-bc50656a593b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vj4bz" Apr 17 16:31:19.927822 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.926274 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f0075f81-88ff-4518-bd4e-bc50656a593b-ovnkube-script-lib\") pod \"ovnkube-node-vj4bz\" (UID: \"f0075f81-88ff-4518-bd4e-bc50656a593b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vj4bz" Apr 17 16:31:19.927822 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.926305 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/f269b2bc-3b99-4857-9296-e9d0485a3bde-etc-selinux\") pod \"aws-ebs-csi-driver-node-lxg4x\" (UID: \"f269b2bc-3b99-4857-9296-e9d0485a3bde\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lxg4x" Apr 17 16:31:19.927822 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.926315 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/759006e8-ec0f-48e4-bf68-b87d7dcbf08e-cnibin\") pod \"multus-additional-cni-plugins-mnjjm\" (UID: \"759006e8-ec0f-48e4-bf68-b87d7dcbf08e\") " pod="openshift-multus/multus-additional-cni-plugins-mnjjm" Apr 17 16:31:19.927822 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.926321 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/428c05e5-0a0b-4a8c-8239-83701e670fe3-tmp\") pod \"tuned-9nxzl\" (UID: \"428c05e5-0a0b-4a8c-8239-83701e670fe3\") " pod="openshift-cluster-node-tuning-operator/tuned-9nxzl" Apr 17 16:31:19.927822 ip-10-0-131-177 kubenswrapper[2572]: 
I0417 16:31:19.926275 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f0075f81-88ff-4518-bd4e-bc50656a593b-host-cni-netd\") pod \"ovnkube-node-vj4bz\" (UID: \"f0075f81-88ff-4518-bd4e-bc50656a593b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vj4bz" Apr 17 16:31:19.927822 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.925836 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f0075f81-88ff-4518-bd4e-bc50656a593b-run-openvswitch\") pod \"ovnkube-node-vj4bz\" (UID: \"f0075f81-88ff-4518-bd4e-bc50656a593b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vj4bz" Apr 17 16:31:19.928626 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.926347 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/759006e8-ec0f-48e4-bf68-b87d7dcbf08e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-mnjjm\" (UID: \"759006e8-ec0f-48e4-bf68-b87d7dcbf08e\") " pod="openshift-multus/multus-additional-cni-plugins-mnjjm" Apr 17 16:31:19.928626 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.926368 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f0075f81-88ff-4518-bd4e-bc50656a593b-host-cni-bin\") pod \"ovnkube-node-vj4bz\" (UID: \"f0075f81-88ff-4518-bd4e-bc50656a593b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vj4bz" Apr 17 16:31:19.928626 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.926390 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f269b2bc-3b99-4857-9296-e9d0485a3bde-registration-dir\") pod \"aws-ebs-csi-driver-node-lxg4x\" (UID: \"f269b2bc-3b99-4857-9296-e9d0485a3bde\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lxg4x" Apr 17 16:31:19.928626 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.926399 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f0075f81-88ff-4518-bd4e-bc50656a593b-ovnkube-config\") pod \"ovnkube-node-vj4bz\" (UID: \"f0075f81-88ff-4518-bd4e-bc50656a593b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vj4bz" Apr 17 16:31:19.928626 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.926406 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ff70e5e0-9bde-4a87-af27-6726427e4ba4-etc-kubernetes\") pod \"multus-kxp4l\" (UID: \"ff70e5e0-9bde-4a87-af27-6726427e4ba4\") " pod="openshift-multus/multus-kxp4l" Apr 17 16:31:19.928626 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.926454 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/428c05e5-0a0b-4a8c-8239-83701e670fe3-etc-sysctl-d\") pod \"tuned-9nxzl\" (UID: \"428c05e5-0a0b-4a8c-8239-83701e670fe3\") " pod="openshift-cluster-node-tuning-operator/tuned-9nxzl" Apr 17 16:31:19.928626 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.926478 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ff70e5e0-9bde-4a87-af27-6726427e4ba4-system-cni-dir\") pod \"multus-kxp4l\" (UID: \"ff70e5e0-9bde-4a87-af27-6726427e4ba4\") " pod="openshift-multus/multus-kxp4l" Apr 17 16:31:19.928626 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.926503 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/e7e38ded-7b99-4b86-9ba3-f0cdd8e37344-agent-certs\") pod \"konnectivity-agent-clpwx\" (UID: \"e7e38ded-7b99-4b86-9ba3-f0cdd8e37344\") " 
pod="kube-system/konnectivity-agent-clpwx" Apr 17 16:31:19.928626 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.926529 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4kfsl\" (UniqueName: \"kubernetes.io/projected/ff70e5e0-9bde-4a87-af27-6726427e4ba4-kube-api-access-4kfsl\") pod \"multus-kxp4l\" (UID: \"ff70e5e0-9bde-4a87-af27-6726427e4ba4\") " pod="openshift-multus/multus-kxp4l" Apr 17 16:31:19.928626 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.926555 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/428c05e5-0a0b-4a8c-8239-83701e670fe3-sys\") pod \"tuned-9nxzl\" (UID: \"428c05e5-0a0b-4a8c-8239-83701e670fe3\") " pod="openshift-cluster-node-tuning-operator/tuned-9nxzl" Apr 17 16:31:19.928626 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.926580 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/759006e8-ec0f-48e4-bf68-b87d7dcbf08e-os-release\") pod \"multus-additional-cni-plugins-mnjjm\" (UID: \"759006e8-ec0f-48e4-bf68-b87d7dcbf08e\") " pod="openshift-multus/multus-additional-cni-plugins-mnjjm" Apr 17 16:31:19.928626 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.926607 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/759006e8-ec0f-48e4-bf68-b87d7dcbf08e-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-mnjjm\" (UID: \"759006e8-ec0f-48e4-bf68-b87d7dcbf08e\") " pod="openshift-multus/multus-additional-cni-plugins-mnjjm" Apr 17 16:31:19.928626 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.926632 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f0075f81-88ff-4518-bd4e-bc50656a593b-host-cni-bin\") pod \"ovnkube-node-vj4bz\" (UID: 
\"f0075f81-88ff-4518-bd4e-bc50656a593b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vj4bz" Apr 17 16:31:19.928626 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.926632 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vdt9w\" (UniqueName: \"kubernetes.io/projected/2685d399-ce45-4aec-bf5c-ce5d17cb16f4-kube-api-access-vdt9w\") pod \"node-ca-gmh4h\" (UID: \"2685d399-ce45-4aec-bf5c-ce5d17cb16f4\") " pod="openshift-image-registry/node-ca-gmh4h" Apr 17 16:31:19.928626 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.926678 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5gfw\" (UniqueName: \"kubernetes.io/projected/f269b2bc-3b99-4857-9296-e9d0485a3bde-kube-api-access-v5gfw\") pod \"aws-ebs-csi-driver-node-lxg4x\" (UID: \"f269b2bc-3b99-4857-9296-e9d0485a3bde\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lxg4x" Apr 17 16:31:19.928626 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.926708 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/36f3412d-e266-4f24-8ea6-1f3d3cdd2546-metrics-certs\") pod \"network-metrics-daemon-vw79z\" (UID: \"36f3412d-e266-4f24-8ea6-1f3d3cdd2546\") " pod="openshift-multus/network-metrics-daemon-vw79z" Apr 17 16:31:19.928626 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.926760 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/428c05e5-0a0b-4a8c-8239-83701e670fe3-etc-kubernetes\") pod \"tuned-9nxzl\" (UID: \"428c05e5-0a0b-4a8c-8239-83701e670fe3\") " pod="openshift-cluster-node-tuning-operator/tuned-9nxzl" Apr 17 16:31:19.929451 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.926783 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/2685d399-ce45-4aec-bf5c-ce5d17cb16f4-host\") pod \"node-ca-gmh4h\" (UID: \"2685d399-ce45-4aec-bf5c-ce5d17cb16f4\") " pod="openshift-image-registry/node-ca-gmh4h" Apr 17 16:31:19.929451 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.926802 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/759006e8-ec0f-48e4-bf68-b87d7dcbf08e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-mnjjm\" (UID: \"759006e8-ec0f-48e4-bf68-b87d7dcbf08e\") " pod="openshift-multus/multus-additional-cni-plugins-mnjjm" Apr 17 16:31:19.929451 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.926831 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2685d399-ce45-4aec-bf5c-ce5d17cb16f4-serviceca\") pod \"node-ca-gmh4h\" (UID: \"2685d399-ce45-4aec-bf5c-ce5d17cb16f4\") " pod="openshift-image-registry/node-ca-gmh4h" Apr 17 16:31:19.929451 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.926425 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ff70e5e0-9bde-4a87-af27-6726427e4ba4-etc-kubernetes\") pod \"multus-kxp4l\" (UID: \"ff70e5e0-9bde-4a87-af27-6726427e4ba4\") " pod="openshift-multus/multus-kxp4l" Apr 17 16:31:19.929451 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.926854 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/428c05e5-0a0b-4a8c-8239-83701e670fe3-etc-kubernetes\") pod \"tuned-9nxzl\" (UID: \"428c05e5-0a0b-4a8c-8239-83701e670fe3\") " pod="openshift-cluster-node-tuning-operator/tuned-9nxzl" Apr 17 16:31:19.929451 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.926859 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/2685d399-ce45-4aec-bf5c-ce5d17cb16f4-host\") pod \"node-ca-gmh4h\" (UID: \"2685d399-ce45-4aec-bf5c-ce5d17cb16f4\") " pod="openshift-image-registry/node-ca-gmh4h" Apr 17 16:31:19.929451 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.926878 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f0075f81-88ff-4518-bd4e-bc50656a593b-ovnkube-script-lib\") pod \"ovnkube-node-vj4bz\" (UID: \"f0075f81-88ff-4518-bd4e-bc50656a593b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vj4bz" Apr 17 16:31:19.929451 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.926931 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ff70e5e0-9bde-4a87-af27-6726427e4ba4-system-cni-dir\") pod \"multus-kxp4l\" (UID: \"ff70e5e0-9bde-4a87-af27-6726427e4ba4\") " pod="openshift-multus/multus-kxp4l" Apr 17 16:31:19.929451 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.926932 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/428c05e5-0a0b-4a8c-8239-83701e670fe3-sys\") pod \"tuned-9nxzl\" (UID: \"428c05e5-0a0b-4a8c-8239-83701e670fe3\") " pod="openshift-cluster-node-tuning-operator/tuned-9nxzl" Apr 17 16:31:19.929451 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.926963 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/428c05e5-0a0b-4a8c-8239-83701e670fe3-etc-sysctl-d\") pod \"tuned-9nxzl\" (UID: \"428c05e5-0a0b-4a8c-8239-83701e670fe3\") " pod="openshift-cluster-node-tuning-operator/tuned-9nxzl" Apr 17 16:31:19.929451 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.926981 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/759006e8-ec0f-48e4-bf68-b87d7dcbf08e-os-release\") pod 
\"multus-additional-cni-plugins-mnjjm\" (UID: \"759006e8-ec0f-48e4-bf68-b87d7dcbf08e\") " pod="openshift-multus/multus-additional-cni-plugins-mnjjm" Apr 17 16:31:19.929451 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.927332 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ff70e5e0-9bde-4a87-af27-6726427e4ba4-host-run-multus-certs\") pod \"multus-kxp4l\" (UID: \"ff70e5e0-9bde-4a87-af27-6726427e4ba4\") " pod="openshift-multus/multus-kxp4l" Apr 17 16:31:19.929451 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.927408 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/759006e8-ec0f-48e4-bf68-b87d7dcbf08e-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-mnjjm\" (UID: \"759006e8-ec0f-48e4-bf68-b87d7dcbf08e\") " pod="openshift-multus/multus-additional-cni-plugins-mnjjm" Apr 17 16:31:19.929451 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.928202 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f0075f81-88ff-4518-bd4e-bc50656a593b-ovn-node-metrics-cert\") pod \"ovnkube-node-vj4bz\" (UID: \"f0075f81-88ff-4518-bd4e-bc50656a593b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vj4bz" Apr 17 16:31:19.929451 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.928381 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/428c05e5-0a0b-4a8c-8239-83701e670fe3-etc-tuned\") pod \"tuned-9nxzl\" (UID: \"428c05e5-0a0b-4a8c-8239-83701e670fe3\") " pod="openshift-cluster-node-tuning-operator/tuned-9nxzl" Apr 17 16:31:19.929451 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.929409 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/428c05e5-0a0b-4a8c-8239-83701e670fe3-tmp\") pod \"tuned-9nxzl\" (UID: \"428c05e5-0a0b-4a8c-8239-83701e670fe3\") " pod="openshift-cluster-node-tuning-operator/tuned-9nxzl" Apr 17 16:31:19.930200 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.929738 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/e7e38ded-7b99-4b86-9ba3-f0cdd8e37344-agent-certs\") pod \"konnectivity-agent-clpwx\" (UID: \"e7e38ded-7b99-4b86-9ba3-f0cdd8e37344\") " pod="kube-system/konnectivity-agent-clpwx" Apr 17 16:31:19.938029 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.937849 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mlm4\" (UniqueName: \"kubernetes.io/projected/428c05e5-0a0b-4a8c-8239-83701e670fe3-kube-api-access-7mlm4\") pod \"tuned-9nxzl\" (UID: \"428c05e5-0a0b-4a8c-8239-83701e670fe3\") " pod="openshift-cluster-node-tuning-operator/tuned-9nxzl" Apr 17 16:31:19.938029 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.937935 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-22qtd\" (UniqueName: \"kubernetes.io/projected/8528bd48-4f37-4fef-bd6e-7df9d6a2773f-kube-api-access-22qtd\") pod \"node-resolver-5q2mv\" (UID: \"8528bd48-4f37-4fef-bd6e-7df9d6a2773f\") " pod="openshift-dns/node-resolver-5q2mv" Apr 17 16:31:19.938029 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.937986 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4kfsl\" (UniqueName: \"kubernetes.io/projected/ff70e5e0-9bde-4a87-af27-6726427e4ba4-kube-api-access-4kfsl\") pod \"multus-kxp4l\" (UID: \"ff70e5e0-9bde-4a87-af27-6726427e4ba4\") " pod="openshift-multus/multus-kxp4l" Apr 17 16:31:19.938229 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.938145 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tllmk\" (UniqueName: 
\"kubernetes.io/projected/f0075f81-88ff-4518-bd4e-bc50656a593b-kube-api-access-tllmk\") pod \"ovnkube-node-vj4bz\" (UID: \"f0075f81-88ff-4518-bd4e-bc50656a593b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vj4bz" Apr 17 16:31:19.938229 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.938159 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdt9w\" (UniqueName: \"kubernetes.io/projected/2685d399-ce45-4aec-bf5c-ce5d17cb16f4-kube-api-access-vdt9w\") pod \"node-ca-gmh4h\" (UID: \"2685d399-ce45-4aec-bf5c-ce5d17cb16f4\") " pod="openshift-image-registry/node-ca-gmh4h" Apr 17 16:31:19.939204 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.939187 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzbr4\" (UniqueName: \"kubernetes.io/projected/759006e8-ec0f-48e4-bf68-b87d7dcbf08e-kube-api-access-vzbr4\") pod \"multus-additional-cni-plugins-mnjjm\" (UID: \"759006e8-ec0f-48e4-bf68-b87d7dcbf08e\") " pod="openshift-multus/multus-additional-cni-plugins-mnjjm" Apr 17 16:31:19.972907 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.972867 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-177.ec2.internal" event={"ID":"7e3c596a27faede1f97b6bb0972592f6","Type":"ContainerStarted","Data":"ca3de6d38b93b6c399871987b2853f5573e3834d414e34567593e4bec6b3fc86"} Apr 17 16:31:19.973887 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:19.973858 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-177.ec2.internal" event={"ID":"50b2c37a28961f1c8aacb6ad5db58d22","Type":"ContainerStarted","Data":"1f86ed160e0181393b131a1660b378c043fa4c43b768c748a06d578c2d18b76e"} Apr 17 16:31:20.027443 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:20.027410 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v5gfw\" (UniqueName: 
\"kubernetes.io/projected/f269b2bc-3b99-4857-9296-e9d0485a3bde-kube-api-access-v5gfw\") pod \"aws-ebs-csi-driver-node-lxg4x\" (UID: \"f269b2bc-3b99-4857-9296-e9d0485a3bde\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lxg4x" Apr 17 16:31:20.027613 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:20.027452 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/36f3412d-e266-4f24-8ea6-1f3d3cdd2546-metrics-certs\") pod \"network-metrics-daemon-vw79z\" (UID: \"36f3412d-e266-4f24-8ea6-1f3d3cdd2546\") " pod="openshift-multus/network-metrics-daemon-vw79z" Apr 17 16:31:20.027613 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:20.027482 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f269b2bc-3b99-4857-9296-e9d0485a3bde-socket-dir\") pod \"aws-ebs-csi-driver-node-lxg4x\" (UID: \"f269b2bc-3b99-4857-9296-e9d0485a3bde\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lxg4x" Apr 17 16:31:20.027613 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:20.027513 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/f269b2bc-3b99-4857-9296-e9d0485a3bde-sys-fs\") pod \"aws-ebs-csi-driver-node-lxg4x\" (UID: \"f269b2bc-3b99-4857-9296-e9d0485a3bde\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lxg4x" Apr 17 16:31:20.027613 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:20.027575 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/f269b2bc-3b99-4857-9296-e9d0485a3bde-sys-fs\") pod \"aws-ebs-csi-driver-node-lxg4x\" (UID: \"f269b2bc-3b99-4857-9296-e9d0485a3bde\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lxg4x" Apr 17 16:31:20.027613 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:20.027594 2572 secret.go:189] 
Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:31:20.027833 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:20.027687 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/36f3412d-e266-4f24-8ea6-1f3d3cdd2546-metrics-certs podName:36f3412d-e266-4f24-8ea6-1f3d3cdd2546 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:20.527654027 +0000 UTC m=+3.115242998 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/36f3412d-e266-4f24-8ea6-1f3d3cdd2546-metrics-certs") pod "network-metrics-daemon-vw79z" (UID: "36f3412d-e266-4f24-8ea6-1f3d3cdd2546") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:31:20.027833 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:20.027701 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f269b2bc-3b99-4857-9296-e9d0485a3bde-socket-dir\") pod \"aws-ebs-csi-driver-node-lxg4x\" (UID: \"f269b2bc-3b99-4857-9296-e9d0485a3bde\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lxg4x" Apr 17 16:31:20.027833 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:20.027747 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qldm9\" (UniqueName: \"kubernetes.io/projected/36f3412d-e266-4f24-8ea6-1f3d3cdd2546-kube-api-access-qldm9\") pod \"network-metrics-daemon-vw79z\" (UID: \"36f3412d-e266-4f24-8ea6-1f3d3cdd2546\") " pod="openshift-multus/network-metrics-daemon-vw79z" Apr 17 16:31:20.027833 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:20.027793 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/e0dcce32-f52f-45ec-b12d-45f981a5e5bf-iptables-alerter-script\") pod \"iptables-alerter-qg47l\" (UID: 
\"e0dcce32-f52f-45ec-b12d-45f981a5e5bf\") " pod="openshift-network-operator/iptables-alerter-qg47l" Apr 17 16:31:20.027833 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:20.027819 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e0dcce32-f52f-45ec-b12d-45f981a5e5bf-host-slash\") pod \"iptables-alerter-qg47l\" (UID: \"e0dcce32-f52f-45ec-b12d-45f981a5e5bf\") " pod="openshift-network-operator/iptables-alerter-qg47l" Apr 17 16:31:20.028088 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:20.027840 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f269b2bc-3b99-4857-9296-e9d0485a3bde-kubelet-dir\") pod \"aws-ebs-csi-driver-node-lxg4x\" (UID: \"f269b2bc-3b99-4857-9296-e9d0485a3bde\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lxg4x" Apr 17 16:31:20.028088 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:20.027876 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e0dcce32-f52f-45ec-b12d-45f981a5e5bf-host-slash\") pod \"iptables-alerter-qg47l\" (UID: \"e0dcce32-f52f-45ec-b12d-45f981a5e5bf\") " pod="openshift-network-operator/iptables-alerter-qg47l" Apr 17 16:31:20.028088 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:20.027902 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/f269b2bc-3b99-4857-9296-e9d0485a3bde-device-dir\") pod \"aws-ebs-csi-driver-node-lxg4x\" (UID: \"f269b2bc-3b99-4857-9296-e9d0485a3bde\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lxg4x" Apr 17 16:31:20.028088 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:20.027926 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2kd2z\" (UniqueName: 
\"kubernetes.io/projected/e0dcce32-f52f-45ec-b12d-45f981a5e5bf-kube-api-access-2kd2z\") pod \"iptables-alerter-qg47l\" (UID: \"e0dcce32-f52f-45ec-b12d-45f981a5e5bf\") " pod="openshift-network-operator/iptables-alerter-qg47l" Apr 17 16:31:20.028088 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:20.027943 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b8m6s\" (UniqueName: \"kubernetes.io/projected/e90a9ccd-0623-4495-a588-60c702965b82-kube-api-access-b8m6s\") pod \"network-check-target-xrrkr\" (UID: \"e90a9ccd-0623-4495-a588-60c702965b82\") " pod="openshift-network-diagnostics/network-check-target-xrrkr" Apr 17 16:31:20.028088 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:20.027964 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/f269b2bc-3b99-4857-9296-e9d0485a3bde-etc-selinux\") pod \"aws-ebs-csi-driver-node-lxg4x\" (UID: \"f269b2bc-3b99-4857-9296-e9d0485a3bde\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lxg4x" Apr 17 16:31:20.028088 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:20.027971 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/f269b2bc-3b99-4857-9296-e9d0485a3bde-device-dir\") pod \"aws-ebs-csi-driver-node-lxg4x\" (UID: \"f269b2bc-3b99-4857-9296-e9d0485a3bde\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lxg4x" Apr 17 16:31:20.028088 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:20.027983 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f269b2bc-3b99-4857-9296-e9d0485a3bde-registration-dir\") pod \"aws-ebs-csi-driver-node-lxg4x\" (UID: \"f269b2bc-3b99-4857-9296-e9d0485a3bde\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lxg4x" Apr 17 16:31:20.028088 ip-10-0-131-177 kubenswrapper[2572]: 
I0417 16:31:20.028006 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f269b2bc-3b99-4857-9296-e9d0485a3bde-kubelet-dir\") pod \"aws-ebs-csi-driver-node-lxg4x\" (UID: \"f269b2bc-3b99-4857-9296-e9d0485a3bde\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lxg4x" Apr 17 16:31:20.028088 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:20.028036 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f269b2bc-3b99-4857-9296-e9d0485a3bde-registration-dir\") pod \"aws-ebs-csi-driver-node-lxg4x\" (UID: \"f269b2bc-3b99-4857-9296-e9d0485a3bde\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lxg4x" Apr 17 16:31:20.028500 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:20.028108 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/f269b2bc-3b99-4857-9296-e9d0485a3bde-etc-selinux\") pod \"aws-ebs-csi-driver-node-lxg4x\" (UID: \"f269b2bc-3b99-4857-9296-e9d0485a3bde\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lxg4x" Apr 17 16:31:20.028500 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:20.028225 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/e0dcce32-f52f-45ec-b12d-45f981a5e5bf-iptables-alerter-script\") pod \"iptables-alerter-qg47l\" (UID: \"e0dcce32-f52f-45ec-b12d-45f981a5e5bf\") " pod="openshift-network-operator/iptables-alerter-qg47l" Apr 17 16:31:20.035038 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:20.035016 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 16:31:20.035038 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:20.035039 2572 projected.go:289] Couldn't get configMap 
openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 16:31:20.035401 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:20.035063 2572 projected.go:194] Error preparing data for projected volume kube-api-access-b8m6s for pod openshift-network-diagnostics/network-check-target-xrrkr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:31:20.035401 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:20.035129 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e90a9ccd-0623-4495-a588-60c702965b82-kube-api-access-b8m6s podName:e90a9ccd-0623-4495-a588-60c702965b82 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:20.535112569 +0000 UTC m=+3.122701537 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-b8m6s" (UniqueName: "kubernetes.io/projected/e90a9ccd-0623-4495-a588-60c702965b82-kube-api-access-b8m6s") pod "network-check-target-xrrkr" (UID: "e90a9ccd-0623-4495-a588-60c702965b82") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:31:20.037345 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:20.037323 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kd2z\" (UniqueName: \"kubernetes.io/projected/e0dcce32-f52f-45ec-b12d-45f981a5e5bf-kube-api-access-2kd2z\") pod \"iptables-alerter-qg47l\" (UID: \"e0dcce32-f52f-45ec-b12d-45f981a5e5bf\") " pod="openshift-network-operator/iptables-alerter-qg47l" Apr 17 16:31:20.037594 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:20.037576 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5gfw\" (UniqueName: 
\"kubernetes.io/projected/f269b2bc-3b99-4857-9296-e9d0485a3bde-kube-api-access-v5gfw\") pod \"aws-ebs-csi-driver-node-lxg4x\" (UID: \"f269b2bc-3b99-4857-9296-e9d0485a3bde\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lxg4x" Apr 17 16:31:20.037657 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:20.037643 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qldm9\" (UniqueName: \"kubernetes.io/projected/36f3412d-e266-4f24-8ea6-1f3d3cdd2546-kube-api-access-qldm9\") pod \"network-metrics-daemon-vw79z\" (UID: \"36f3412d-e266-4f24-8ea6-1f3d3cdd2546\") " pod="openshift-multus/network-metrics-daemon-vw79z" Apr 17 16:31:20.120451 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:20.120375 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-mnjjm" Apr 17 16:31:20.128313 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:20.128288 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-kxp4l" Apr 17 16:31:20.135988 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:20.135967 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vj4bz" Apr 17 16:31:20.143579 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:20.143562 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-gmh4h" Apr 17 16:31:20.151113 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:20.151087 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-5q2mv" Apr 17 16:31:20.160613 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:20.160596 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-clpwx" Apr 17 16:31:20.166195 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:20.166164 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lxg4x" Apr 17 16:31:20.173698 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:20.173680 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-9nxzl" Apr 17 16:31:20.181206 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:20.181189 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-qg47l" Apr 17 16:31:20.530387 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:20.530320 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/36f3412d-e266-4f24-8ea6-1f3d3cdd2546-metrics-certs\") pod \"network-metrics-daemon-vw79z\" (UID: \"36f3412d-e266-4f24-8ea6-1f3d3cdd2546\") " pod="openshift-multus/network-metrics-daemon-vw79z" Apr 17 16:31:20.530517 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:20.530423 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:31:20.530517 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:20.530480 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/36f3412d-e266-4f24-8ea6-1f3d3cdd2546-metrics-certs podName:36f3412d-e266-4f24-8ea6-1f3d3cdd2546 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:21.530466814 +0000 UTC m=+4.118055762 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/36f3412d-e266-4f24-8ea6-1f3d3cdd2546-metrics-certs") pod "network-metrics-daemon-vw79z" (UID: "36f3412d-e266-4f24-8ea6-1f3d3cdd2546") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:31:20.631142 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:20.631083 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b8m6s\" (UniqueName: \"kubernetes.io/projected/e90a9ccd-0623-4495-a588-60c702965b82-kube-api-access-b8m6s\") pod \"network-check-target-xrrkr\" (UID: \"e90a9ccd-0623-4495-a588-60c702965b82\") " pod="openshift-network-diagnostics/network-check-target-xrrkr" Apr 17 16:31:20.631304 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:20.631277 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 16:31:20.631357 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:20.631305 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 16:31:20.631357 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:20.631319 2572 projected.go:194] Error preparing data for projected volume kube-api-access-b8m6s for pod openshift-network-diagnostics/network-check-target-xrrkr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:31:20.631436 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:20.631386 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e90a9ccd-0623-4495-a588-60c702965b82-kube-api-access-b8m6s podName:e90a9ccd-0623-4495-a588-60c702965b82 nodeName:}" failed. 
No retries permitted until 2026-04-17 16:31:21.63137166 +0000 UTC m=+4.218960607 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-b8m6s" (UniqueName: "kubernetes.io/projected/e90a9ccd-0623-4495-a588-60c702965b82-kube-api-access-b8m6s") pod "network-check-target-xrrkr" (UID: "e90a9ccd-0623-4495-a588-60c702965b82") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:31:20.673823 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:20.673755 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff70e5e0_9bde_4a87_af27_6726427e4ba4.slice/crio-df52b55ea25eda4db68a125247c8c004b519ae4b92a20717891c88424b4e1a3f WatchSource:0}: Error finding container df52b55ea25eda4db68a125247c8c004b519ae4b92a20717891c88424b4e1a3f: Status 404 returned error can't find the container with id df52b55ea25eda4db68a125247c8c004b519ae4b92a20717891c88424b4e1a3f Apr 17 16:31:20.675376 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:20.674988 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2685d399_ce45_4aec_bf5c_ce5d17cb16f4.slice/crio-51b7eb53878aa267cbb5a05ae846a1da886ed2a95c20fbc0fe395f4a08a4cfa5 WatchSource:0}: Error finding container 51b7eb53878aa267cbb5a05ae846a1da886ed2a95c20fbc0fe395f4a08a4cfa5: Status 404 returned error can't find the container with id 51b7eb53878aa267cbb5a05ae846a1da886ed2a95c20fbc0fe395f4a08a4cfa5 Apr 17 16:31:20.676358 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:20.676333 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod428c05e5_0a0b_4a8c_8239_83701e670fe3.slice/crio-616ce7a098628c36fd7092507422d343f1b039d9362227fc206599dd38682ae6 WatchSource:0}: Error finding container 
616ce7a098628c36fd7092507422d343f1b039d9362227fc206599dd38682ae6: Status 404 returned error can't find the container with id 616ce7a098628c36fd7092507422d343f1b039d9362227fc206599dd38682ae6 Apr 17 16:31:20.680295 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:20.680270 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf269b2bc_3b99_4857_9296_e9d0485a3bde.slice/crio-501662766f6bb274478546f4b9e2c76e8ac527b7c78e24cda0adaec12e870230 WatchSource:0}: Error finding container 501662766f6bb274478546f4b9e2c76e8ac527b7c78e24cda0adaec12e870230: Status 404 returned error can't find the container with id 501662766f6bb274478546f4b9e2c76e8ac527b7c78e24cda0adaec12e870230 Apr 17 16:31:20.680874 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:20.680821 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0dcce32_f52f_45ec_b12d_45f981a5e5bf.slice/crio-46f8b6414ab5bf94c8be51e896bd2df0b539162011b42649418a53b30deb2ebd WatchSource:0}: Error finding container 46f8b6414ab5bf94c8be51e896bd2df0b539162011b42649418a53b30deb2ebd: Status 404 returned error can't find the container with id 46f8b6414ab5bf94c8be51e896bd2df0b539162011b42649418a53b30deb2ebd Apr 17 16:31:20.682239 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:20.682181 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf0075f81_88ff_4518_bd4e_bc50656a593b.slice/crio-1a0bb9a1130fa0b26e816182dfce9c2a4396330227c21ab635bb1f7483367bee WatchSource:0}: Error finding container 1a0bb9a1130fa0b26e816182dfce9c2a4396330227c21ab635bb1f7483367bee: Status 404 returned error can't find the container with id 1a0bb9a1130fa0b26e816182dfce9c2a4396330227c21ab635bb1f7483367bee Apr 17 16:31:20.862140 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:20.861945 2572 certificate_manager.go:715] "Certificate 
rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 16:26:18 +0000 UTC" deadline="2027-09-13 07:04:00.514239346 +0000 UTC"
Apr 17 16:31:20.862140 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:20.862097 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12326h32m39.652144381s"
Apr 17 16:31:20.978630 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:20.978590 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-177.ec2.internal" event={"ID":"50b2c37a28961f1c8aacb6ad5db58d22","Type":"ContainerStarted","Data":"734bb9108febdf5157841522e1dacc133d07b2b5cd19d0d436b485a5c6c36486"}
Apr 17 16:31:20.979867 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:20.979835 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mnjjm" event={"ID":"759006e8-ec0f-48e4-bf68-b87d7dcbf08e","Type":"ContainerStarted","Data":"f6d4072624f0586cc5c4f5d5ec786e527a25a05dfa710b6337b18b5180df4d10"}
Apr 17 16:31:20.981114 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:20.981094 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-5q2mv" event={"ID":"8528bd48-4f37-4fef-bd6e-7df9d6a2773f","Type":"ContainerStarted","Data":"416aed6df3c7121fe9bf69bb75f85861d4e9e38703a2aae3404bd4492f18d8cf"}
Apr 17 16:31:20.982827 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:20.982805 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lxg4x" event={"ID":"f269b2bc-3b99-4857-9296-e9d0485a3bde","Type":"ContainerStarted","Data":"501662766f6bb274478546f4b9e2c76e8ac527b7c78e24cda0adaec12e870230"}
Apr 17 16:31:20.983929 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:20.983904 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-gmh4h" event={"ID":"2685d399-ce45-4aec-bf5c-ce5d17cb16f4","Type":"ContainerStarted","Data":"51b7eb53878aa267cbb5a05ae846a1da886ed2a95c20fbc0fe395f4a08a4cfa5"}
Apr 17 16:31:20.984933 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:20.984899 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-qg47l" event={"ID":"e0dcce32-f52f-45ec-b12d-45f981a5e5bf","Type":"ContainerStarted","Data":"46f8b6414ab5bf94c8be51e896bd2df0b539162011b42649418a53b30deb2ebd"}
Apr 17 16:31:20.985777 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:20.985755 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-clpwx" event={"ID":"e7e38ded-7b99-4b86-9ba3-f0cdd8e37344","Type":"ContainerStarted","Data":"4f235923751e6fbdde93b239b560b8518c1312ef8f24eb5d4b3a3e3f04271f76"}
Apr 17 16:31:20.988143 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:20.988120 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vj4bz" event={"ID":"f0075f81-88ff-4518-bd4e-bc50656a593b","Type":"ContainerStarted","Data":"1a0bb9a1130fa0b26e816182dfce9c2a4396330227c21ab635bb1f7483367bee"}
Apr 17 16:31:20.989570 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:20.989549 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-9nxzl" event={"ID":"428c05e5-0a0b-4a8c-8239-83701e670fe3","Type":"ContainerStarted","Data":"616ce7a098628c36fd7092507422d343f1b039d9362227fc206599dd38682ae6"}
Apr 17 16:31:20.991059 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:20.991041 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-kxp4l" event={"ID":"ff70e5e0-9bde-4a87-af27-6726427e4ba4","Type":"ContainerStarted","Data":"df52b55ea25eda4db68a125247c8c004b519ae4b92a20717891c88424b4e1a3f"}
Apr 17 16:31:20.992562 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:20.992527 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-177.ec2.internal" podStartSLOduration=1.992515536 podStartE2EDuration="1.992515536s" podCreationTimestamp="2026-04-17 16:31:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:31:20.992005692 +0000 UTC m=+3.579594664" watchObservedRunningTime="2026-04-17 16:31:20.992515536 +0000 UTC m=+3.580104563"
Apr 17 16:31:21.365494 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:21.365463 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-rxw7n"]
Apr 17 16:31:21.367274 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:21.367251 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rxw7n"
Apr 17 16:31:21.367391 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:21.367325 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rxw7n" podUID="6644c140-1be7-4a25-becc-56acd5a0f042"
Apr 17 16:31:21.438196 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:21.437972 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/6644c140-1be7-4a25-becc-56acd5a0f042-dbus\") pod \"global-pull-secret-syncer-rxw7n\" (UID: \"6644c140-1be7-4a25-becc-56acd5a0f042\") " pod="kube-system/global-pull-secret-syncer-rxw7n"
Apr 17 16:31:21.438196 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:21.438065 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/6644c140-1be7-4a25-becc-56acd5a0f042-original-pull-secret\") pod \"global-pull-secret-syncer-rxw7n\" (UID: \"6644c140-1be7-4a25-becc-56acd5a0f042\") " pod="kube-system/global-pull-secret-syncer-rxw7n"
Apr 17 16:31:21.438196 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:21.438117 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/6644c140-1be7-4a25-becc-56acd5a0f042-kubelet-config\") pod \"global-pull-secret-syncer-rxw7n\" (UID: \"6644c140-1be7-4a25-becc-56acd5a0f042\") " pod="kube-system/global-pull-secret-syncer-rxw7n"
Apr 17 16:31:21.539930 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:21.539053 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/36f3412d-e266-4f24-8ea6-1f3d3cdd2546-metrics-certs\") pod \"network-metrics-daemon-vw79z\" (UID: \"36f3412d-e266-4f24-8ea6-1f3d3cdd2546\") " pod="openshift-multus/network-metrics-daemon-vw79z"
Apr 17 16:31:21.539930 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:21.539100 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/6644c140-1be7-4a25-becc-56acd5a0f042-kubelet-config\") pod \"global-pull-secret-syncer-rxw7n\" (UID: \"6644c140-1be7-4a25-becc-56acd5a0f042\") " pod="kube-system/global-pull-secret-syncer-rxw7n"
Apr 17 16:31:21.539930 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:21.539141 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/6644c140-1be7-4a25-becc-56acd5a0f042-dbus\") pod \"global-pull-secret-syncer-rxw7n\" (UID: \"6644c140-1be7-4a25-becc-56acd5a0f042\") " pod="kube-system/global-pull-secret-syncer-rxw7n"
Apr 17 16:31:21.539930 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:21.539192 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/6644c140-1be7-4a25-becc-56acd5a0f042-original-pull-secret\") pod \"global-pull-secret-syncer-rxw7n\" (UID: \"6644c140-1be7-4a25-becc-56acd5a0f042\") " pod="kube-system/global-pull-secret-syncer-rxw7n"
Apr 17 16:31:21.539930 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:21.539307 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 16:31:21.539930 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:21.539378 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6644c140-1be7-4a25-becc-56acd5a0f042-original-pull-secret podName:6644c140-1be7-4a25-becc-56acd5a0f042 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:22.039350738 +0000 UTC m=+4.626939707 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/6644c140-1be7-4a25-becc-56acd5a0f042-original-pull-secret") pod "global-pull-secret-syncer-rxw7n" (UID: "6644c140-1be7-4a25-becc-56acd5a0f042") : object "kube-system"/"original-pull-secret" not registered
Apr 17 16:31:21.539930 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:21.539644 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 16:31:21.539930 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:21.539689 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/36f3412d-e266-4f24-8ea6-1f3d3cdd2546-metrics-certs podName:36f3412d-e266-4f24-8ea6-1f3d3cdd2546 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:23.539674793 +0000 UTC m=+6.127263744 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/36f3412d-e266-4f24-8ea6-1f3d3cdd2546-metrics-certs") pod "network-metrics-daemon-vw79z" (UID: "36f3412d-e266-4f24-8ea6-1f3d3cdd2546") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 16:31:21.539930 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:21.539767 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/6644c140-1be7-4a25-becc-56acd5a0f042-kubelet-config\") pod \"global-pull-secret-syncer-rxw7n\" (UID: \"6644c140-1be7-4a25-becc-56acd5a0f042\") " pod="kube-system/global-pull-secret-syncer-rxw7n"
Apr 17 16:31:21.539930 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:21.539891 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/6644c140-1be7-4a25-becc-56acd5a0f042-dbus\") pod \"global-pull-secret-syncer-rxw7n\" (UID: \"6644c140-1be7-4a25-becc-56acd5a0f042\") " pod="kube-system/global-pull-secret-syncer-rxw7n"
Apr 17 16:31:21.640339 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:21.640262 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b8m6s\" (UniqueName: \"kubernetes.io/projected/e90a9ccd-0623-4495-a588-60c702965b82-kube-api-access-b8m6s\") pod \"network-check-target-xrrkr\" (UID: \"e90a9ccd-0623-4495-a588-60c702965b82\") " pod="openshift-network-diagnostics/network-check-target-xrrkr"
Apr 17 16:31:21.640490 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:21.640433 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 16:31:21.640490 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:21.640467 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 16:31:21.640490 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:21.640482 2572 projected.go:194] Error preparing data for projected volume kube-api-access-b8m6s for pod openshift-network-diagnostics/network-check-target-xrrkr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 16:31:21.640655 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:21.640537 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e90a9ccd-0623-4495-a588-60c702965b82-kube-api-access-b8m6s podName:e90a9ccd-0623-4495-a588-60c702965b82 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:23.640518369 +0000 UTC m=+6.228107321 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-b8m6s" (UniqueName: "kubernetes.io/projected/e90a9ccd-0623-4495-a588-60c702965b82-kube-api-access-b8m6s") pod "network-check-target-xrrkr" (UID: "e90a9ccd-0623-4495-a588-60c702965b82") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 16:31:21.970222 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:21.969876 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vw79z"
Apr 17 16:31:21.970222 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:21.969921 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xrrkr"
Apr 17 16:31:21.970222 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:21.970020 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vw79z" podUID="36f3412d-e266-4f24-8ea6-1f3d3cdd2546"
Apr 17 16:31:21.970222 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:21.970109 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xrrkr" podUID="e90a9ccd-0623-4495-a588-60c702965b82"
Apr 17 16:31:22.010737 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:22.009651 2572 generic.go:358] "Generic (PLEG): container finished" podID="7e3c596a27faede1f97b6bb0972592f6" containerID="9d0152591154e536161a2a03877478758db1e23e978b42f99ebcb177566cfb83" exitCode=0
Apr 17 16:31:22.010737 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:22.010512 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-177.ec2.internal" event={"ID":"7e3c596a27faede1f97b6bb0972592f6","Type":"ContainerDied","Data":"9d0152591154e536161a2a03877478758db1e23e978b42f99ebcb177566cfb83"}
Apr 17 16:31:22.044557 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:22.044500 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/6644c140-1be7-4a25-becc-56acd5a0f042-original-pull-secret\") pod \"global-pull-secret-syncer-rxw7n\" (UID: \"6644c140-1be7-4a25-becc-56acd5a0f042\") " pod="kube-system/global-pull-secret-syncer-rxw7n"
Apr 17 16:31:22.044687 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:22.044635 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 16:31:22.044765 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:22.044698 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6644c140-1be7-4a25-becc-56acd5a0f042-original-pull-secret podName:6644c140-1be7-4a25-becc-56acd5a0f042 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:23.044679065 +0000 UTC m=+5.632268037 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/6644c140-1be7-4a25-becc-56acd5a0f042-original-pull-secret") pod "global-pull-secret-syncer-rxw7n" (UID: "6644c140-1be7-4a25-becc-56acd5a0f042") : object "kube-system"/"original-pull-secret" not registered
Apr 17 16:31:22.969162 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:22.968659 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rxw7n"
Apr 17 16:31:22.969162 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:22.968831 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rxw7n" podUID="6644c140-1be7-4a25-becc-56acd5a0f042"
Apr 17 16:31:23.021512 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:23.021477 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-177.ec2.internal" event={"ID":"7e3c596a27faede1f97b6bb0972592f6","Type":"ContainerStarted","Data":"2a19325761c30e25493dd3a8955d8f3e6f15860a59cc6ad42bf87dc682c555fd"}
Apr 17 16:31:23.038553 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:23.038500 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-177.ec2.internal" podStartSLOduration=4.038480195 podStartE2EDuration="4.038480195s" podCreationTimestamp="2026-04-17 16:31:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:31:23.037531081 +0000 UTC m=+5.625120056" watchObservedRunningTime="2026-04-17 16:31:23.038480195 +0000 UTC m=+5.626069165"
Apr 17 16:31:23.054168 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:23.054139 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/6644c140-1be7-4a25-becc-56acd5a0f042-original-pull-secret\") pod \"global-pull-secret-syncer-rxw7n\" (UID: \"6644c140-1be7-4a25-becc-56acd5a0f042\") " pod="kube-system/global-pull-secret-syncer-rxw7n"
Apr 17 16:31:23.054334 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:23.054311 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 16:31:23.054408 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:23.054372 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6644c140-1be7-4a25-becc-56acd5a0f042-original-pull-secret podName:6644c140-1be7-4a25-becc-56acd5a0f042 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:25.054354667 +0000 UTC m=+7.641943628 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/6644c140-1be7-4a25-becc-56acd5a0f042-original-pull-secret") pod "global-pull-secret-syncer-rxw7n" (UID: "6644c140-1be7-4a25-becc-56acd5a0f042") : object "kube-system"/"original-pull-secret" not registered
Apr 17 16:31:23.559144 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:23.558564 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/36f3412d-e266-4f24-8ea6-1f3d3cdd2546-metrics-certs\") pod \"network-metrics-daemon-vw79z\" (UID: \"36f3412d-e266-4f24-8ea6-1f3d3cdd2546\") " pod="openshift-multus/network-metrics-daemon-vw79z"
Apr 17 16:31:23.559144 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:23.558732 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 16:31:23.559144 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:23.558795 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/36f3412d-e266-4f24-8ea6-1f3d3cdd2546-metrics-certs podName:36f3412d-e266-4f24-8ea6-1f3d3cdd2546 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:27.558776269 +0000 UTC m=+10.146365220 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/36f3412d-e266-4f24-8ea6-1f3d3cdd2546-metrics-certs") pod "network-metrics-daemon-vw79z" (UID: "36f3412d-e266-4f24-8ea6-1f3d3cdd2546") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 16:31:23.660461 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:23.659769 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b8m6s\" (UniqueName: \"kubernetes.io/projected/e90a9ccd-0623-4495-a588-60c702965b82-kube-api-access-b8m6s\") pod \"network-check-target-xrrkr\" (UID: \"e90a9ccd-0623-4495-a588-60c702965b82\") " pod="openshift-network-diagnostics/network-check-target-xrrkr"
Apr 17 16:31:23.660461 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:23.659995 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 16:31:23.660461 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:23.660018 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 16:31:23.660461 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:23.660030 2572 projected.go:194] Error preparing data for projected volume kube-api-access-b8m6s for pod openshift-network-diagnostics/network-check-target-xrrkr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 16:31:23.660461 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:23.660100 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e90a9ccd-0623-4495-a588-60c702965b82-kube-api-access-b8m6s podName:e90a9ccd-0623-4495-a588-60c702965b82 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:27.660070756 +0000 UTC m=+10.247659720 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-b8m6s" (UniqueName: "kubernetes.io/projected/e90a9ccd-0623-4495-a588-60c702965b82-kube-api-access-b8m6s") pod "network-check-target-xrrkr" (UID: "e90a9ccd-0623-4495-a588-60c702965b82") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 16:31:23.970439 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:23.970352 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vw79z"
Apr 17 16:31:23.970596 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:23.970474 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vw79z" podUID="36f3412d-e266-4f24-8ea6-1f3d3cdd2546"
Apr 17 16:31:23.970596 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:23.970558 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xrrkr"
Apr 17 16:31:23.970705 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:23.970664 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xrrkr" podUID="e90a9ccd-0623-4495-a588-60c702965b82"
Apr 17 16:31:24.968889 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:24.968853 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rxw7n"
Apr 17 16:31:24.969359 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:24.968973 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rxw7n" podUID="6644c140-1be7-4a25-becc-56acd5a0f042"
Apr 17 16:31:25.072936 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:25.072900 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/6644c140-1be7-4a25-becc-56acd5a0f042-original-pull-secret\") pod \"global-pull-secret-syncer-rxw7n\" (UID: \"6644c140-1be7-4a25-becc-56acd5a0f042\") " pod="kube-system/global-pull-secret-syncer-rxw7n"
Apr 17 16:31:25.073127 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:25.073105 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 16:31:25.073193 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:25.073179 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6644c140-1be7-4a25-becc-56acd5a0f042-original-pull-secret podName:6644c140-1be7-4a25-becc-56acd5a0f042 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:29.073159144 +0000 UTC m=+11.660748093 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/6644c140-1be7-4a25-becc-56acd5a0f042-original-pull-secret") pod "global-pull-secret-syncer-rxw7n" (UID: "6644c140-1be7-4a25-becc-56acd5a0f042") : object "kube-system"/"original-pull-secret" not registered
Apr 17 16:31:25.969894 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:25.969864 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vw79z"
Apr 17 16:31:25.970333 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:25.969912 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xrrkr"
Apr 17 16:31:25.970333 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:25.970021 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vw79z" podUID="36f3412d-e266-4f24-8ea6-1f3d3cdd2546"
Apr 17 16:31:25.970333 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:25.970128 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xrrkr" podUID="e90a9ccd-0623-4495-a588-60c702965b82"
Apr 17 16:31:26.969093 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:26.969062 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rxw7n"
Apr 17 16:31:26.969283 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:26.969194 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rxw7n" podUID="6644c140-1be7-4a25-becc-56acd5a0f042"
Apr 17 16:31:27.595152 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:27.595118 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/36f3412d-e266-4f24-8ea6-1f3d3cdd2546-metrics-certs\") pod \"network-metrics-daemon-vw79z\" (UID: \"36f3412d-e266-4f24-8ea6-1f3d3cdd2546\") " pod="openshift-multus/network-metrics-daemon-vw79z"
Apr 17 16:31:27.595585 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:27.595246 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 16:31:27.595585 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:27.595316 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/36f3412d-e266-4f24-8ea6-1f3d3cdd2546-metrics-certs podName:36f3412d-e266-4f24-8ea6-1f3d3cdd2546 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:35.595294891 +0000 UTC m=+18.182883842 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/36f3412d-e266-4f24-8ea6-1f3d3cdd2546-metrics-certs") pod "network-metrics-daemon-vw79z" (UID: "36f3412d-e266-4f24-8ea6-1f3d3cdd2546") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 16:31:27.696233 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:27.696191 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b8m6s\" (UniqueName: \"kubernetes.io/projected/e90a9ccd-0623-4495-a588-60c702965b82-kube-api-access-b8m6s\") pod \"network-check-target-xrrkr\" (UID: \"e90a9ccd-0623-4495-a588-60c702965b82\") " pod="openshift-network-diagnostics/network-check-target-xrrkr"
Apr 17 16:31:27.696408 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:27.696372 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 16:31:27.696408 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:27.696396 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 16:31:27.696408 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:27.696408 2572 projected.go:194] Error preparing data for projected volume kube-api-access-b8m6s for pod openshift-network-diagnostics/network-check-target-xrrkr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 16:31:27.696563 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:27.696461 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e90a9ccd-0623-4495-a588-60c702965b82-kube-api-access-b8m6s podName:e90a9ccd-0623-4495-a588-60c702965b82 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:35.696442775 +0000 UTC m=+18.284031725 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-b8m6s" (UniqueName: "kubernetes.io/projected/e90a9ccd-0623-4495-a588-60c702965b82-kube-api-access-b8m6s") pod "network-check-target-xrrkr" (UID: "e90a9ccd-0623-4495-a588-60c702965b82") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 16:31:27.970953 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:27.970819 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vw79z"
Apr 17 16:31:27.971112 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:27.970956 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vw79z" podUID="36f3412d-e266-4f24-8ea6-1f3d3cdd2546"
Apr 17 16:31:27.971112 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:27.971030 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xrrkr"
Apr 17 16:31:27.972053 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:27.972015 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xrrkr" podUID="e90a9ccd-0623-4495-a588-60c702965b82"
Apr 17 16:31:28.968910 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:28.968873 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rxw7n"
Apr 17 16:31:28.969360 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:28.969070 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rxw7n" podUID="6644c140-1be7-4a25-becc-56acd5a0f042"
Apr 17 16:31:29.109005 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:29.108968 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/6644c140-1be7-4a25-becc-56acd5a0f042-original-pull-secret\") pod \"global-pull-secret-syncer-rxw7n\" (UID: \"6644c140-1be7-4a25-becc-56acd5a0f042\") " pod="kube-system/global-pull-secret-syncer-rxw7n"
Apr 17 16:31:29.109190 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:29.109162 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 16:31:29.109271 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:29.109243 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6644c140-1be7-4a25-becc-56acd5a0f042-original-pull-secret podName:6644c140-1be7-4a25-becc-56acd5a0f042 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:37.109214712 +0000 UTC m=+19.696803677 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/6644c140-1be7-4a25-becc-56acd5a0f042-original-pull-secret") pod "global-pull-secret-syncer-rxw7n" (UID: "6644c140-1be7-4a25-becc-56acd5a0f042") : object "kube-system"/"original-pull-secret" not registered
Apr 17 16:31:29.969571 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:29.969487 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vw79z"
Apr 17 16:31:29.969999 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:29.969487 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xrrkr"
Apr 17 16:31:29.969999 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:29.969630 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vw79z" podUID="36f3412d-e266-4f24-8ea6-1f3d3cdd2546"
Apr 17 16:31:29.969999 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:29.969666 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xrrkr" podUID="e90a9ccd-0623-4495-a588-60c702965b82"
Apr 17 16:31:30.969363 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:30.969333 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rxw7n"
Apr 17 16:31:30.969533 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:30.969454 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rxw7n" podUID="6644c140-1be7-4a25-becc-56acd5a0f042"
Apr 17 16:31:31.968970 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:31.968935 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vw79z"
Apr 17 16:31:31.969362 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:31.968982 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xrrkr"
Apr 17 16:31:31.969362 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:31.969088 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vw79z" podUID="36f3412d-e266-4f24-8ea6-1f3d3cdd2546"
Apr 17 16:31:31.969362 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:31.969180 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xrrkr" podUID="e90a9ccd-0623-4495-a588-60c702965b82"
Apr 17 16:31:32.969504 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:32.969476 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rxw7n"
Apr 17 16:31:32.969948 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:32.969578 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rxw7n" podUID="6644c140-1be7-4a25-becc-56acd5a0f042"
Apr 17 16:31:33.969160 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:33.969126 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vw79z"
Apr 17 16:31:33.969333 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:33.969126 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xrrkr"
Apr 17 16:31:33.969333 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:33.969241 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-vw79z" podUID="36f3412d-e266-4f24-8ea6-1f3d3cdd2546" Apr 17 16:31:33.969333 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:33.969294 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xrrkr" podUID="e90a9ccd-0623-4495-a588-60c702965b82" Apr 17 16:31:34.968887 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:34.968849 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rxw7n" Apr 17 16:31:34.969342 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:34.968984 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-rxw7n" podUID="6644c140-1be7-4a25-becc-56acd5a0f042" Apr 17 16:31:35.658027 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:35.657990 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/36f3412d-e266-4f24-8ea6-1f3d3cdd2546-metrics-certs\") pod \"network-metrics-daemon-vw79z\" (UID: \"36f3412d-e266-4f24-8ea6-1f3d3cdd2546\") " pod="openshift-multus/network-metrics-daemon-vw79z" Apr 17 16:31:35.658207 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:35.658153 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:31:35.658258 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:35.658225 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/36f3412d-e266-4f24-8ea6-1f3d3cdd2546-metrics-certs podName:36f3412d-e266-4f24-8ea6-1f3d3cdd2546 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:51.658209495 +0000 UTC m=+34.245798448 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/36f3412d-e266-4f24-8ea6-1f3d3cdd2546-metrics-certs") pod "network-metrics-daemon-vw79z" (UID: "36f3412d-e266-4f24-8ea6-1f3d3cdd2546") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:31:35.758874 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:35.758842 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b8m6s\" (UniqueName: \"kubernetes.io/projected/e90a9ccd-0623-4495-a588-60c702965b82-kube-api-access-b8m6s\") pod \"network-check-target-xrrkr\" (UID: \"e90a9ccd-0623-4495-a588-60c702965b82\") " pod="openshift-network-diagnostics/network-check-target-xrrkr" Apr 17 16:31:35.759043 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:35.759010 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 16:31:35.759043 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:35.759028 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 16:31:35.759043 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:35.759042 2572 projected.go:194] Error preparing data for projected volume kube-api-access-b8m6s for pod openshift-network-diagnostics/network-check-target-xrrkr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:31:35.759194 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:35.759102 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e90a9ccd-0623-4495-a588-60c702965b82-kube-api-access-b8m6s podName:e90a9ccd-0623-4495-a588-60c702965b82 nodeName:}" failed. 
No retries permitted until 2026-04-17 16:31:51.759082655 +0000 UTC m=+34.346671618 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-b8m6s" (UniqueName: "kubernetes.io/projected/e90a9ccd-0623-4495-a588-60c702965b82-kube-api-access-b8m6s") pod "network-check-target-xrrkr" (UID: "e90a9ccd-0623-4495-a588-60c702965b82") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:31:35.968899 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:35.968822 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xrrkr" Apr 17 16:31:35.969101 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:35.968822 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vw79z" Apr 17 16:31:35.969101 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:35.968939 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xrrkr" podUID="e90a9ccd-0623-4495-a588-60c702965b82" Apr 17 16:31:35.969101 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:35.969016 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vw79z" podUID="36f3412d-e266-4f24-8ea6-1f3d3cdd2546" Apr 17 16:31:36.969175 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:36.969144 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rxw7n" Apr 17 16:31:36.969636 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:36.969262 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rxw7n" podUID="6644c140-1be7-4a25-becc-56acd5a0f042" Apr 17 16:31:37.169411 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:37.169368 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/6644c140-1be7-4a25-becc-56acd5a0f042-original-pull-secret\") pod \"global-pull-secret-syncer-rxw7n\" (UID: \"6644c140-1be7-4a25-becc-56acd5a0f042\") " pod="kube-system/global-pull-secret-syncer-rxw7n" Apr 17 16:31:37.169578 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:37.169517 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 16:31:37.169652 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:37.169600 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6644c140-1be7-4a25-becc-56acd5a0f042-original-pull-secret podName:6644c140-1be7-4a25-becc-56acd5a0f042 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:53.169578472 +0000 UTC m=+35.757167422 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/6644c140-1be7-4a25-becc-56acd5a0f042-original-pull-secret") pod "global-pull-secret-syncer-rxw7n" (UID: "6644c140-1be7-4a25-becc-56acd5a0f042") : object "kube-system"/"original-pull-secret" not registered Apr 17 16:31:37.969948 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:37.969926 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vw79z" Apr 17 16:31:37.970595 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:37.970014 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vw79z" podUID="36f3412d-e266-4f24-8ea6-1f3d3cdd2546" Apr 17 16:31:37.970595 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:37.970092 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xrrkr" Apr 17 16:31:37.970595 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:37.970174 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xrrkr" podUID="e90a9ccd-0623-4495-a588-60c702965b82" Apr 17 16:31:38.046921 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:38.046867 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-5q2mv" event={"ID":"8528bd48-4f37-4fef-bd6e-7df9d6a2773f","Type":"ContainerStarted","Data":"555a2946876bd9d57dd084afc57e5daf1bfcf990b9e4aa8dda85e882498ac65c"} Apr 17 16:31:38.050560 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:38.050507 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lxg4x" event={"ID":"f269b2bc-3b99-4857-9296-e9d0485a3bde","Type":"ContainerStarted","Data":"8f05f496319626e21f904a1b42c5a044b2a137169b59632c5a416a9f1fb682c7"} Apr 17 16:31:38.052272 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:38.052232 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-gmh4h" event={"ID":"2685d399-ce45-4aec-bf5c-ce5d17cb16f4","Type":"ContainerStarted","Data":"abdbf6003139adc8d10bf6082a00c843486fe1d413a8ffd13343b12fd3837ebc"} Apr 17 16:31:38.053472 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:38.053440 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-clpwx" event={"ID":"e7e38ded-7b99-4b86-9ba3-f0cdd8e37344","Type":"ContainerStarted","Data":"02e43b24bf036bf7680a1bcdc9abb0401dc654603312e2aec9188be3028d280e"} Apr 17 16:31:38.055347 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:38.055325 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vj4bz" event={"ID":"f0075f81-88ff-4518-bd4e-bc50656a593b","Type":"ContainerStarted","Data":"a59a467f00512173635a9fa3e26309ae9a394f9d58624a684b4e496563cd61f3"} Apr 17 16:31:38.055461 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:38.055353 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vj4bz" 
event={"ID":"f0075f81-88ff-4518-bd4e-bc50656a593b","Type":"ContainerStarted","Data":"687306a199bdd9a802fb1a56dfe08f9b250d95a67f8421b7ba8e521f08b0c47f"} Apr 17 16:31:38.056789 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:38.056764 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-9nxzl" event={"ID":"428c05e5-0a0b-4a8c-8239-83701e670fe3","Type":"ContainerStarted","Data":"11f67d0461a23cfca0a1e0ef967b5b7a105089f1223cda8697aee862db256607"} Apr 17 16:31:38.059456 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:38.059428 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-kxp4l" event={"ID":"ff70e5e0-9bde-4a87-af27-6726427e4ba4","Type":"ContainerStarted","Data":"e2465f906ae11109e2e0b4a7c80bb54cc9632b2c9773fc6a060591f3a3832d53"} Apr 17 16:31:38.061531 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:38.061156 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mnjjm" event={"ID":"759006e8-ec0f-48e4-bf68-b87d7dcbf08e","Type":"ContainerStarted","Data":"bccb2f64e7d4e672839cdad86c9234667a99656ff7aa23170e06590450a6a6bb"} Apr 17 16:31:38.063602 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:38.063555 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-5q2mv" podStartSLOduration=3.210424874 podStartE2EDuration="20.063539913s" podCreationTimestamp="2026-04-17 16:31:18 +0000 UTC" firstStartedPulling="2026-04-17 16:31:20.684970617 +0000 UTC m=+3.272559579" lastFinishedPulling="2026-04-17 16:31:37.538085651 +0000 UTC m=+20.125674618" observedRunningTime="2026-04-17 16:31:38.063355754 +0000 UTC m=+20.650944724" watchObservedRunningTime="2026-04-17 16:31:38.063539913 +0000 UTC m=+20.651128897" Apr 17 16:31:38.082998 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:38.082829 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-kxp4l" 
podStartSLOduration=4.179303199 podStartE2EDuration="21.08281415s" podCreationTimestamp="2026-04-17 16:31:17 +0000 UTC" firstStartedPulling="2026-04-17 16:31:20.675463259 +0000 UTC m=+3.263052206" lastFinishedPulling="2026-04-17 16:31:37.578974207 +0000 UTC m=+20.166563157" observedRunningTime="2026-04-17 16:31:38.082549868 +0000 UTC m=+20.670138933" watchObservedRunningTime="2026-04-17 16:31:38.08281415 +0000 UTC m=+20.670403121" Apr 17 16:31:38.119161 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:38.119119 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-gmh4h" podStartSLOduration=3.2574593849999998 podStartE2EDuration="20.119104508s" podCreationTimestamp="2026-04-17 16:31:18 +0000 UTC" firstStartedPulling="2026-04-17 16:31:20.676963915 +0000 UTC m=+3.264552873" lastFinishedPulling="2026-04-17 16:31:37.538609045 +0000 UTC m=+20.126197996" observedRunningTime="2026-04-17 16:31:38.118877087 +0000 UTC m=+20.706466057" watchObservedRunningTime="2026-04-17 16:31:38.119104508 +0000 UTC m=+20.706693527" Apr 17 16:31:38.137612 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:38.137564 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-clpwx" podStartSLOduration=11.107913325 podStartE2EDuration="20.137545188s" podCreationTimestamp="2026-04-17 16:31:18 +0000 UTC" firstStartedPulling="2026-04-17 16:31:20.68636604 +0000 UTC m=+3.273954995" lastFinishedPulling="2026-04-17 16:31:29.715997908 +0000 UTC m=+12.303586858" observedRunningTime="2026-04-17 16:31:38.137283282 +0000 UTC m=+20.724872253" watchObservedRunningTime="2026-04-17 16:31:38.137545188 +0000 UTC m=+20.725134145" Apr 17 16:31:38.157664 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:38.157618 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-9nxzl" podStartSLOduration=3.29667378 podStartE2EDuration="20.157605415s" 
podCreationTimestamp="2026-04-17 16:31:18 +0000 UTC" firstStartedPulling="2026-04-17 16:31:20.679246615 +0000 UTC m=+3.266835566" lastFinishedPulling="2026-04-17 16:31:37.540178247 +0000 UTC m=+20.127767201" observedRunningTime="2026-04-17 16:31:38.156813708 +0000 UTC m=+20.744402677" watchObservedRunningTime="2026-04-17 16:31:38.157605415 +0000 UTC m=+20.745194384" Apr 17 16:31:38.968969 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:38.968944 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rxw7n" Apr 17 16:31:38.969070 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:38.969032 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rxw7n" podUID="6644c140-1be7-4a25-becc-56acd5a0f042" Apr 17 16:31:39.009663 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:39.009623 2572 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 17 16:31:39.063999 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:39.063968 2572 generic.go:358] "Generic (PLEG): container finished" podID="759006e8-ec0f-48e4-bf68-b87d7dcbf08e" containerID="bccb2f64e7d4e672839cdad86c9234667a99656ff7aa23170e06590450a6a6bb" exitCode=0 Apr 17 16:31:39.064093 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:39.064047 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mnjjm" event={"ID":"759006e8-ec0f-48e4-bf68-b87d7dcbf08e","Type":"ContainerDied","Data":"bccb2f64e7d4e672839cdad86c9234667a99656ff7aa23170e06590450a6a6bb"} Apr 17 16:31:39.065599 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:39.065574 
2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lxg4x" event={"ID":"f269b2bc-3b99-4857-9296-e9d0485a3bde","Type":"ContainerStarted","Data":"8f0279bb51c76c5b627286a08044cef441e5240227d9f349d6c8c8bcb6dadac7"} Apr 17 16:31:39.066924 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:39.066905 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-qg47l" event={"ID":"e0dcce32-f52f-45ec-b12d-45f981a5e5bf","Type":"ContainerStarted","Data":"127749b0104838f14baa96f4c6b409763cd03006d89eed6c4134e31f6a7ff41a"} Apr 17 16:31:39.069155 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:39.069139 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vj4bz_f0075f81-88ff-4518-bd4e-bc50656a593b/ovn-acl-logging/0.log" Apr 17 16:31:39.069429 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:39.069412 2572 generic.go:358] "Generic (PLEG): container finished" podID="f0075f81-88ff-4518-bd4e-bc50656a593b" containerID="a59a467f00512173635a9fa3e26309ae9a394f9d58624a684b4e496563cd61f3" exitCode=1 Apr 17 16:31:39.069535 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:39.069502 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vj4bz" event={"ID":"f0075f81-88ff-4518-bd4e-bc50656a593b","Type":"ContainerDied","Data":"a59a467f00512173635a9fa3e26309ae9a394f9d58624a684b4e496563cd61f3"} Apr 17 16:31:39.069535 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:39.069532 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vj4bz" event={"ID":"f0075f81-88ff-4518-bd4e-bc50656a593b","Type":"ContainerStarted","Data":"61e9bb319342528ddc36e022d9514668fd804e57ca5aa35811e2a5f01867d203"} Apr 17 16:31:39.069664 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:39.069564 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-vj4bz" event={"ID":"f0075f81-88ff-4518-bd4e-bc50656a593b","Type":"ContainerStarted","Data":"977e5514efe937b5b4abd51f134c857a83fc62c6f8605d0f5e40e5ea8709fa68"} Apr 17 16:31:39.069664 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:39.069600 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vj4bz" event={"ID":"f0075f81-88ff-4518-bd4e-bc50656a593b","Type":"ContainerStarted","Data":"73aa27b54b8bd53bd6fcf19bd8176a9061d8564764675489c3417593e35ec621"} Apr 17 16:31:39.069664 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:39.069612 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vj4bz" event={"ID":"f0075f81-88ff-4518-bd4e-bc50656a593b","Type":"ContainerStarted","Data":"b93a1a535f20cd6a6488bf050c32b2401567b6ec77cc143572ad3bdf8291e4f1"} Apr 17 16:31:39.102051 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:39.102013 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-qg47l" podStartSLOduration=4.246077 podStartE2EDuration="21.102001564s" podCreationTimestamp="2026-04-17 16:31:18 +0000 UTC" firstStartedPulling="2026-04-17 16:31:20.682589951 +0000 UTC m=+3.270178915" lastFinishedPulling="2026-04-17 16:31:37.538514528 +0000 UTC m=+20.126103479" observedRunningTime="2026-04-17 16:31:39.101743782 +0000 UTC m=+21.689332755" watchObservedRunningTime="2026-04-17 16:31:39.102001564 +0000 UTC m=+21.689590534" Apr 17 16:31:39.885593 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:39.885484 2572 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-17T16:31:39.009643593Z","UUID":"92f757ce-c174-4d65-a91d-33c3fd951a10","Handler":null,"Name":"","Endpoint":""} Apr 17 16:31:39.887314 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:39.887288 2572 csi_plugin.go:106] 
kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 17 16:31:39.887430 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:39.887322 2572 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 17 16:31:39.969498 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:39.969473 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vw79z" Apr 17 16:31:39.969635 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:39.969593 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vw79z" podUID="36f3412d-e266-4f24-8ea6-1f3d3cdd2546" Apr 17 16:31:39.969763 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:39.969650 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xrrkr" Apr 17 16:31:39.969820 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:39.969774 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xrrkr" podUID="e90a9ccd-0623-4495-a588-60c702965b82" Apr 17 16:31:40.348546 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:40.348363 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-clpwx" Apr 17 16:31:40.568655 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:40.568627 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-clpwx" Apr 17 16:31:40.569255 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:40.569236 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-clpwx" Apr 17 16:31:40.969357 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:40.969286 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rxw7n" Apr 17 16:31:40.969512 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:40.969394 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-rxw7n" podUID="6644c140-1be7-4a25-becc-56acd5a0f042" Apr 17 16:31:41.074738 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:41.074682 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lxg4x" event={"ID":"f269b2bc-3b99-4857-9296-e9d0485a3bde","Type":"ContainerStarted","Data":"48079131424a63919c219989af2f9ec0f83984c7fa3c455689aa858239327613"} Apr 17 16:31:41.078057 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:41.078031 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vj4bz_f0075f81-88ff-4518-bd4e-bc50656a593b/ovn-acl-logging/0.log" Apr 17 16:31:41.078425 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:41.078401 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vj4bz" event={"ID":"f0075f81-88ff-4518-bd4e-bc50656a593b","Type":"ContainerStarted","Data":"50a23f8a60e0ddee1304f8cf8bec4924d8210609341842910d9d6896a518e0c2"} Apr 17 16:31:41.079085 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:41.079067 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-clpwx" Apr 17 16:31:41.090490 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:41.090444 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lxg4x" podStartSLOduration=3.805623115 podStartE2EDuration="23.090428985s" podCreationTimestamp="2026-04-17 16:31:18 +0000 UTC" firstStartedPulling="2026-04-17 16:31:20.68200245 +0000 UTC m=+3.269591410" lastFinishedPulling="2026-04-17 16:31:39.966808319 +0000 UTC m=+22.554397280" observedRunningTime="2026-04-17 16:31:41.09041774 +0000 UTC m=+23.678006710" watchObservedRunningTime="2026-04-17 16:31:41.090428985 +0000 UTC m=+23.678017955" Apr 17 16:31:41.968992 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:41.968958 2572 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vw79z"
Apr 17 16:31:41.969456 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:41.969081 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vw79z" podUID="36f3412d-e266-4f24-8ea6-1f3d3cdd2546"
Apr 17 16:31:41.969456 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:41.969132 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xrrkr"
Apr 17 16:31:41.969456 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:41.969221 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xrrkr" podUID="e90a9ccd-0623-4495-a588-60c702965b82"
Apr 17 16:31:42.969000 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:42.968925 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rxw7n"
Apr 17 16:31:42.969159 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:42.969027 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rxw7n" podUID="6644c140-1be7-4a25-becc-56acd5a0f042"
Apr 17 16:31:43.969174 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:43.969141 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vw79z"
Apr 17 16:31:43.969519 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:43.969248 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vw79z" podUID="36f3412d-e266-4f24-8ea6-1f3d3cdd2546"
Apr 17 16:31:43.969519 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:43.969333 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xrrkr"
Apr 17 16:31:43.969519 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:43.969431 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xrrkr" podUID="e90a9ccd-0623-4495-a588-60c702965b82"
Apr 17 16:31:44.086415 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:44.086389 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vj4bz_f0075f81-88ff-4518-bd4e-bc50656a593b/ovn-acl-logging/0.log"
Apr 17 16:31:44.086712 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:44.086692 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vj4bz" event={"ID":"f0075f81-88ff-4518-bd4e-bc50656a593b","Type":"ContainerStarted","Data":"f9238a851a065097b9c850adb0fc725055ff6b656f09fde3214cccda01ca10dd"}
Apr 17 16:31:44.087076 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:44.087026 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-vj4bz"
Apr 17 16:31:44.087076 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:44.087054 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-vj4bz"
Apr 17 16:31:44.087227 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:44.087174 2572 scope.go:117] "RemoveContainer" containerID="a59a467f00512173635a9fa3e26309ae9a394f9d58624a684b4e496563cd61f3"
Apr 17 16:31:44.088809 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:44.088775 2572 generic.go:358] "Generic (PLEG): container finished" podID="759006e8-ec0f-48e4-bf68-b87d7dcbf08e" containerID="824653a5f3613f0cf415305517e78431dc81ae4568a19190088ef2bae232c2e4" exitCode=0
Apr 17 16:31:44.088890 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:44.088819 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mnjjm" event={"ID":"759006e8-ec0f-48e4-bf68-b87d7dcbf08e","Type":"ContainerDied","Data":"824653a5f3613f0cf415305517e78431dc81ae4568a19190088ef2bae232c2e4"}
Apr 17 16:31:44.103345 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:44.103322 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vj4bz"
Apr 17 16:31:44.103464 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:44.103449 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vj4bz"
Apr 17 16:31:44.969094 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:44.969069 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rxw7n"
Apr 17 16:31:44.969246 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:44.969194 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rxw7n" podUID="6644c140-1be7-4a25-becc-56acd5a0f042"
Apr 17 16:31:45.093849 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:45.093705 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vj4bz_f0075f81-88ff-4518-bd4e-bc50656a593b/ovn-acl-logging/0.log"
Apr 17 16:31:45.094211 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:45.094172 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vj4bz" event={"ID":"f0075f81-88ff-4518-bd4e-bc50656a593b","Type":"ContainerStarted","Data":"8ff53025c443d92cd30b99563413cddb979a8ed0f66a131187896c55c552a47c"}
Apr 17 16:31:45.094306 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:45.094293 2572 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 17 16:31:45.099293 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:45.099269 2572 generic.go:358] "Generic (PLEG): container finished" podID="759006e8-ec0f-48e4-bf68-b87d7dcbf08e" containerID="037c364faa964304e1579e6c78d12b86f84a310b40d848c59630267eb6f58331" exitCode=0
Apr 17 16:31:45.099394 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:45.099318 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mnjjm" event={"ID":"759006e8-ec0f-48e4-bf68-b87d7dcbf08e","Type":"ContainerDied","Data":"037c364faa964304e1579e6c78d12b86f84a310b40d848c59630267eb6f58331"}
Apr 17 16:31:45.127644 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:45.127601 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-vj4bz" podStartSLOduration=10.045889438 podStartE2EDuration="27.127591222s" podCreationTimestamp="2026-04-17 16:31:18 +0000 UTC" firstStartedPulling="2026-04-17 16:31:20.684209678 +0000 UTC m=+3.271798639" lastFinishedPulling="2026-04-17 16:31:37.765911471 +0000 UTC m=+20.353500423" observedRunningTime="2026-04-17 16:31:45.125809169 +0000 UTC m=+27.713398136" watchObservedRunningTime="2026-04-17 16:31:45.127591222 +0000 UTC m=+27.715180202"
Apr 17 16:31:45.222993 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:45.222926 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-xrrkr"]
Apr 17 16:31:45.223134 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:45.223037 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xrrkr"
Apr 17 16:31:45.223134 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:45.223114 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xrrkr" podUID="e90a9ccd-0623-4495-a588-60c702965b82"
Apr 17 16:31:45.226020 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:45.225998 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-rxw7n"]
Apr 17 16:31:45.226134 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:45.226092 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rxw7n"
Apr 17 16:31:45.226180 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:45.226162 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rxw7n" podUID="6644c140-1be7-4a25-becc-56acd5a0f042"
Apr 17 16:31:45.253603 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:45.253578 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-vw79z"]
Apr 17 16:31:45.253757 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:45.253741 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vw79z"
Apr 17 16:31:45.253895 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:45.253866 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vw79z" podUID="36f3412d-e266-4f24-8ea6-1f3d3cdd2546"
Apr 17 16:31:46.102704 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:46.102667 2572 generic.go:358] "Generic (PLEG): container finished" podID="759006e8-ec0f-48e4-bf68-b87d7dcbf08e" containerID="99470c32b45ab724625fb8fc59ce2f6e4644ebba779f93457ae93d01315db935" exitCode=0
Apr 17 16:31:46.103066 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:46.102750 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mnjjm" event={"ID":"759006e8-ec0f-48e4-bf68-b87d7dcbf08e","Type":"ContainerDied","Data":"99470c32b45ab724625fb8fc59ce2f6e4644ebba779f93457ae93d01315db935"}
Apr 17 16:31:46.103066 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:46.102955 2572 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 17 16:31:46.969305 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:46.969272 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xrrkr"
Apr 17 16:31:46.969305 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:46.969296 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vw79z"
Apr 17 16:31:46.969504 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:46.969286 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rxw7n"
Apr 17 16:31:46.969504 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:46.969390 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xrrkr" podUID="e90a9ccd-0623-4495-a588-60c702965b82"
Apr 17 16:31:46.969504 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:46.969469 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rxw7n" podUID="6644c140-1be7-4a25-becc-56acd5a0f042"
Apr 17 16:31:46.969631 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:46.969545 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vw79z" podUID="36f3412d-e266-4f24-8ea6-1f3d3cdd2546"
Apr 17 16:31:47.024665 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:47.024638 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-vj4bz"
Apr 17 16:31:48.968757 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:48.968704 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xrrkr"
Apr 17 16:31:48.969347 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:48.968735 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rxw7n"
Apr 17 16:31:48.969347 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:48.968836 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xrrkr" podUID="e90a9ccd-0623-4495-a588-60c702965b82"
Apr 17 16:31:48.969347 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:48.968860 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vw79z"
Apr 17 16:31:48.969347 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:48.968971 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rxw7n" podUID="6644c140-1be7-4a25-becc-56acd5a0f042"
Apr 17 16:31:48.969347 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:48.969076 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vw79z" podUID="36f3412d-e266-4f24-8ea6-1f3d3cdd2546"
Apr 17 16:31:50.716268 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:50.716238 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-177.ec2.internal" event="NodeReady"
Apr 17 16:31:50.716739 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:50.716381 2572 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 17 16:31:50.769398 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:50.769366 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-4nqbv"]
Apr 17 16:31:50.773799 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:50.773779 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-4nqbv"
Apr 17 16:31:50.776387 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:50.776368 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-78gjs\""
Apr 17 16:31:50.776815 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:50.776789 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 17 16:31:50.777540 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:50.777521 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-gvlnm"]
Apr 17 16:31:50.778766 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:50.778583 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 17 16:31:50.780322 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:50.780303 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-gvlnm"
Apr 17 16:31:50.782945 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:50.782924 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 17 16:31:50.783198 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:50.783173 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 17 16:31:50.783318 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:50.783242 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 17 16:31:50.783422 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:50.783395 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-qsxk4\""
Apr 17 16:31:50.784501 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:50.784482 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-4nqbv"]
Apr 17 16:31:50.792028 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:50.792010 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-gvlnm"]
Apr 17 16:31:50.882431 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:50.882388 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d60e97bd-f20c-497d-ae2a-6dac86b93c77-cert\") pod \"ingress-canary-gvlnm\" (UID: \"d60e97bd-f20c-497d-ae2a-6dac86b93c77\") " pod="openshift-ingress-canary/ingress-canary-gvlnm"
Apr 17 16:31:50.882431 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:50.882435 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7d470685-9573-40d7-b32c-929ed88cc56d-metrics-tls\") pod \"dns-default-4nqbv\" (UID: \"7d470685-9573-40d7-b32c-929ed88cc56d\") " pod="openshift-dns/dns-default-4nqbv"
Apr 17 16:31:50.882668 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:50.882492 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7d470685-9573-40d7-b32c-929ed88cc56d-config-volume\") pod \"dns-default-4nqbv\" (UID: \"7d470685-9573-40d7-b32c-929ed88cc56d\") " pod="openshift-dns/dns-default-4nqbv"
Apr 17 16:31:50.882668 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:50.882608 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqb2p\" (UniqueName: \"kubernetes.io/projected/7d470685-9573-40d7-b32c-929ed88cc56d-kube-api-access-gqb2p\") pod \"dns-default-4nqbv\" (UID: \"7d470685-9573-40d7-b32c-929ed88cc56d\") " pod="openshift-dns/dns-default-4nqbv"
Apr 17 16:31:50.882668 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:50.882648 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7d470685-9573-40d7-b32c-929ed88cc56d-tmp-dir\") pod \"dns-default-4nqbv\" (UID: \"7d470685-9573-40d7-b32c-929ed88cc56d\") " pod="openshift-dns/dns-default-4nqbv"
Apr 17 16:31:50.882827 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:50.882673 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2l52\" (UniqueName: \"kubernetes.io/projected/d60e97bd-f20c-497d-ae2a-6dac86b93c77-kube-api-access-s2l52\") pod \"ingress-canary-gvlnm\" (UID: \"d60e97bd-f20c-497d-ae2a-6dac86b93c77\") " pod="openshift-ingress-canary/ingress-canary-gvlnm"
Apr 17 16:31:50.968793 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:50.968697 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rxw7n"
Apr 17 16:31:50.968948 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:50.968697 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xrrkr"
Apr 17 16:31:50.968948 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:50.968703 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vw79z"
Apr 17 16:31:50.971362 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:50.971337 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 17 16:31:50.971471 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:50.971366 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 17 16:31:50.971471 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:50.971382 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 17 16:31:50.971471 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:50.971461 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 17 16:31:50.971471 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:50.971348 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-tvdp2\""
Apr 17 16:31:50.971654 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:50.971597 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-qkztw\""
Apr 17 16:31:50.983002 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:50.982977 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7d470685-9573-40d7-b32c-929ed88cc56d-config-volume\") pod \"dns-default-4nqbv\" (UID: \"7d470685-9573-40d7-b32c-929ed88cc56d\") " pod="openshift-dns/dns-default-4nqbv"
Apr 17 16:31:50.983645 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:50.983621 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gqb2p\" (UniqueName: \"kubernetes.io/projected/7d470685-9573-40d7-b32c-929ed88cc56d-kube-api-access-gqb2p\") pod \"dns-default-4nqbv\" (UID: \"7d470685-9573-40d7-b32c-929ed88cc56d\") " pod="openshift-dns/dns-default-4nqbv"
Apr 17 16:31:50.983759 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:50.983687 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7d470685-9573-40d7-b32c-929ed88cc56d-tmp-dir\") pod \"dns-default-4nqbv\" (UID: \"7d470685-9573-40d7-b32c-929ed88cc56d\") " pod="openshift-dns/dns-default-4nqbv"
Apr 17 16:31:50.983759 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:50.983729 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s2l52\" (UniqueName: \"kubernetes.io/projected/d60e97bd-f20c-497d-ae2a-6dac86b93c77-kube-api-access-s2l52\") pod \"ingress-canary-gvlnm\" (UID: \"d60e97bd-f20c-497d-ae2a-6dac86b93c77\") " pod="openshift-ingress-canary/ingress-canary-gvlnm"
Apr 17 16:31:50.983881 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:50.983784 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d60e97bd-f20c-497d-ae2a-6dac86b93c77-cert\") pod \"ingress-canary-gvlnm\" (UID: \"d60e97bd-f20c-497d-ae2a-6dac86b93c77\") " pod="openshift-ingress-canary/ingress-canary-gvlnm"
Apr 17 16:31:50.983881 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:50.983813 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7d470685-9573-40d7-b32c-929ed88cc56d-metrics-tls\") pod \"dns-default-4nqbv\" (UID: \"7d470685-9573-40d7-b32c-929ed88cc56d\") " pod="openshift-dns/dns-default-4nqbv"
Apr 17 16:31:50.983988 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:50.983912 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 16:31:50.983988 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:50.983971 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d470685-9573-40d7-b32c-929ed88cc56d-metrics-tls podName:7d470685-9573-40d7-b32c-929ed88cc56d nodeName:}" failed. No retries permitted until 2026-04-17 16:31:51.483948752 +0000 UTC m=+34.071537701 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7d470685-9573-40d7-b32c-929ed88cc56d-metrics-tls") pod "dns-default-4nqbv" (UID: "7d470685-9573-40d7-b32c-929ed88cc56d") : secret "dns-default-metrics-tls" not found
Apr 17 16:31:50.984098 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:50.984020 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7d470685-9573-40d7-b32c-929ed88cc56d-tmp-dir\") pod \"dns-default-4nqbv\" (UID: \"7d470685-9573-40d7-b32c-929ed88cc56d\") " pod="openshift-dns/dns-default-4nqbv"
Apr 17 16:31:50.984098 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:50.984069 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7d470685-9573-40d7-b32c-929ed88cc56d-config-volume\") pod \"dns-default-4nqbv\" (UID: \"7d470685-9573-40d7-b32c-929ed88cc56d\") " pod="openshift-dns/dns-default-4nqbv"
Apr 17 16:31:50.984209 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:50.984100 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 16:31:50.984209 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:50.984150 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d60e97bd-f20c-497d-ae2a-6dac86b93c77-cert podName:d60e97bd-f20c-497d-ae2a-6dac86b93c77 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:51.484136216 +0000 UTC m=+34.071725168 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d60e97bd-f20c-497d-ae2a-6dac86b93c77-cert") pod "ingress-canary-gvlnm" (UID: "d60e97bd-f20c-497d-ae2a-6dac86b93c77") : secret "canary-serving-cert" not found
Apr 17 16:31:50.994430 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:50.994412 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqb2p\" (UniqueName: \"kubernetes.io/projected/7d470685-9573-40d7-b32c-929ed88cc56d-kube-api-access-gqb2p\") pod \"dns-default-4nqbv\" (UID: \"7d470685-9573-40d7-b32c-929ed88cc56d\") " pod="openshift-dns/dns-default-4nqbv"
Apr 17 16:31:50.994618 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:50.994593 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2l52\" (UniqueName: \"kubernetes.io/projected/d60e97bd-f20c-497d-ae2a-6dac86b93c77-kube-api-access-s2l52\") pod \"ingress-canary-gvlnm\" (UID: \"d60e97bd-f20c-497d-ae2a-6dac86b93c77\") " pod="openshift-ingress-canary/ingress-canary-gvlnm"
Apr 17 16:31:51.487832 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:51.487791 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d60e97bd-f20c-497d-ae2a-6dac86b93c77-cert\") pod \"ingress-canary-gvlnm\" (UID: \"d60e97bd-f20c-497d-ae2a-6dac86b93c77\") " pod="openshift-ingress-canary/ingress-canary-gvlnm"
Apr 17 16:31:51.487832 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:51.487841 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7d470685-9573-40d7-b32c-929ed88cc56d-metrics-tls\") pod \"dns-default-4nqbv\" (UID: \"7d470685-9573-40d7-b32c-929ed88cc56d\") " pod="openshift-dns/dns-default-4nqbv"
Apr 17 16:31:51.488073 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:51.487944 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 16:31:51.488073 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:51.487953 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 16:31:51.488073 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:51.488006 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d470685-9573-40d7-b32c-929ed88cc56d-metrics-tls podName:7d470685-9573-40d7-b32c-929ed88cc56d nodeName:}" failed. No retries permitted until 2026-04-17 16:31:52.487989008 +0000 UTC m=+35.075577982 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7d470685-9573-40d7-b32c-929ed88cc56d-metrics-tls") pod "dns-default-4nqbv" (UID: "7d470685-9573-40d7-b32c-929ed88cc56d") : secret "dns-default-metrics-tls" not found
Apr 17 16:31:51.488222 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:51.488101 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d60e97bd-f20c-497d-ae2a-6dac86b93c77-cert podName:d60e97bd-f20c-497d-ae2a-6dac86b93c77 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:52.48808047 +0000 UTC m=+35.075669433 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d60e97bd-f20c-497d-ae2a-6dac86b93c77-cert") pod "ingress-canary-gvlnm" (UID: "d60e97bd-f20c-497d-ae2a-6dac86b93c77") : secret "canary-serving-cert" not found
Apr 17 16:31:51.689533 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:51.689493 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/36f3412d-e266-4f24-8ea6-1f3d3cdd2546-metrics-certs\") pod \"network-metrics-daemon-vw79z\" (UID: \"36f3412d-e266-4f24-8ea6-1f3d3cdd2546\") " pod="openshift-multus/network-metrics-daemon-vw79z"
Apr 17 16:31:51.689670 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:51.689640 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 17 16:31:51.689739 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:51.689705 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/36f3412d-e266-4f24-8ea6-1f3d3cdd2546-metrics-certs podName:36f3412d-e266-4f24-8ea6-1f3d3cdd2546 nodeName:}" failed. No retries permitted until 2026-04-17 16:32:23.689690637 +0000 UTC m=+66.277279586 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/36f3412d-e266-4f24-8ea6-1f3d3cdd2546-metrics-certs") pod "network-metrics-daemon-vw79z" (UID: "36f3412d-e266-4f24-8ea6-1f3d3cdd2546") : secret "metrics-daemon-secret" not found
Apr 17 16:31:51.789959 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:51.789893 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b8m6s\" (UniqueName: \"kubernetes.io/projected/e90a9ccd-0623-4495-a588-60c702965b82-kube-api-access-b8m6s\") pod \"network-check-target-xrrkr\" (UID: \"e90a9ccd-0623-4495-a588-60c702965b82\") " pod="openshift-network-diagnostics/network-check-target-xrrkr"
Apr 17 16:31:51.792420 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:51.792402 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8m6s\" (UniqueName: \"kubernetes.io/projected/e90a9ccd-0623-4495-a588-60c702965b82-kube-api-access-b8m6s\") pod \"network-check-target-xrrkr\" (UID: \"e90a9ccd-0623-4495-a588-60c702965b82\") " pod="openshift-network-diagnostics/network-check-target-xrrkr"
Apr 17 16:31:51.886820 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:51.886792 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xrrkr"
Apr 17 16:31:52.195315 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:52.195291 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-xrrkr"]
Apr 17 16:31:52.198957 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:52.198928 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode90a9ccd_0623_4495_a588_60c702965b82.slice/crio-c63b038badb72d8938a7ef15e80b3d3ea92f3df6e62a94d14dbea0567a661fe1 WatchSource:0}: Error finding container c63b038badb72d8938a7ef15e80b3d3ea92f3df6e62a94d14dbea0567a661fe1: Status 404 returned error can't find the container with id c63b038badb72d8938a7ef15e80b3d3ea92f3df6e62a94d14dbea0567a661fe1
Apr 17 16:31:52.494634 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:52.494600 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d60e97bd-f20c-497d-ae2a-6dac86b93c77-cert\") pod \"ingress-canary-gvlnm\" (UID: \"d60e97bd-f20c-497d-ae2a-6dac86b93c77\") " pod="openshift-ingress-canary/ingress-canary-gvlnm"
Apr 17 16:31:52.494827 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:52.494645 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7d470685-9573-40d7-b32c-929ed88cc56d-metrics-tls\") pod \"dns-default-4nqbv\" (UID: \"7d470685-9573-40d7-b32c-929ed88cc56d\") " pod="openshift-dns/dns-default-4nqbv"
Apr 17 16:31:52.494827 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:52.494781 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 16:31:52.494827 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:52.494784 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 16:31:52.494954 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:52.494849 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d470685-9573-40d7-b32c-929ed88cc56d-metrics-tls podName:7d470685-9573-40d7-b32c-929ed88cc56d nodeName:}" failed. No retries permitted until 2026-04-17 16:31:54.494835276 +0000 UTC m=+37.082424224 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7d470685-9573-40d7-b32c-929ed88cc56d-metrics-tls") pod "dns-default-4nqbv" (UID: "7d470685-9573-40d7-b32c-929ed88cc56d") : secret "dns-default-metrics-tls" not found
Apr 17 16:31:52.494954 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:52.494862 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d60e97bd-f20c-497d-ae2a-6dac86b93c77-cert podName:d60e97bd-f20c-497d-ae2a-6dac86b93c77 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:54.494856989 +0000 UTC m=+37.082445937 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d60e97bd-f20c-497d-ae2a-6dac86b93c77-cert") pod "ingress-canary-gvlnm" (UID: "d60e97bd-f20c-497d-ae2a-6dac86b93c77") : secret "canary-serving-cert" not found
Apr 17 16:31:53.117928 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:53.117893 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xrrkr" event={"ID":"e90a9ccd-0623-4495-a588-60c702965b82","Type":"ContainerStarted","Data":"c63b038badb72d8938a7ef15e80b3d3ea92f3df6e62a94d14dbea0567a661fe1"}
Apr 17 16:31:53.120335 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:53.120311 2572 generic.go:358] "Generic (PLEG): container finished" podID="759006e8-ec0f-48e4-bf68-b87d7dcbf08e" containerID="53eb561995a022c42b3612bde334e4a7469fc93e146f3ed24188e2394c152791" exitCode=0
Apr 17 16:31:53.120461 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:53.120351 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mnjjm" event={"ID":"759006e8-ec0f-48e4-bf68-b87d7dcbf08e","Type":"ContainerDied","Data":"53eb561995a022c42b3612bde334e4a7469fc93e146f3ed24188e2394c152791"}
Apr 17 16:31:53.199852 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:53.199814 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/6644c140-1be7-4a25-becc-56acd5a0f042-original-pull-secret\") pod \"global-pull-secret-syncer-rxw7n\" (UID: \"6644c140-1be7-4a25-becc-56acd5a0f042\") " pod="kube-system/global-pull-secret-syncer-rxw7n"
Apr 17 16:31:53.204052 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:53.204026 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/6644c140-1be7-4a25-becc-56acd5a0f042-original-pull-secret\") pod \"global-pull-secret-syncer-rxw7n\" (UID:
\"6644c140-1be7-4a25-becc-56acd5a0f042\") " pod="kube-system/global-pull-secret-syncer-rxw7n" Apr 17 16:31:53.380447 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:53.380384 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rxw7n" Apr 17 16:31:53.496000 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:53.495850 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-rxw7n"] Apr 17 16:31:53.499795 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:31:53.499769 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6644c140_1be7_4a25_becc_56acd5a0f042.slice/crio-ee4dc094a2d31c6d8f7052176afd47baa97b430a454e600907b87e0cd6fe2774 WatchSource:0}: Error finding container ee4dc094a2d31c6d8f7052176afd47baa97b430a454e600907b87e0cd6fe2774: Status 404 returned error can't find the container with id ee4dc094a2d31c6d8f7052176afd47baa97b430a454e600907b87e0cd6fe2774 Apr 17 16:31:54.123356 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:54.123284 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-rxw7n" event={"ID":"6644c140-1be7-4a25-becc-56acd5a0f042","Type":"ContainerStarted","Data":"ee4dc094a2d31c6d8f7052176afd47baa97b430a454e600907b87e0cd6fe2774"} Apr 17 16:31:54.127007 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:54.126980 2572 generic.go:358] "Generic (PLEG): container finished" podID="759006e8-ec0f-48e4-bf68-b87d7dcbf08e" containerID="1e887d01e1470b0952bbe84bddb0e62625018363a901c7efeb8340bb59848eb0" exitCode=0 Apr 17 16:31:54.127144 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:54.127029 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mnjjm" 
event={"ID":"759006e8-ec0f-48e4-bf68-b87d7dcbf08e","Type":"ContainerDied","Data":"1e887d01e1470b0952bbe84bddb0e62625018363a901c7efeb8340bb59848eb0"} Apr 17 16:31:54.510269 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:54.510191 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d60e97bd-f20c-497d-ae2a-6dac86b93c77-cert\") pod \"ingress-canary-gvlnm\" (UID: \"d60e97bd-f20c-497d-ae2a-6dac86b93c77\") " pod="openshift-ingress-canary/ingress-canary-gvlnm" Apr 17 16:31:54.510269 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:54.510231 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7d470685-9573-40d7-b32c-929ed88cc56d-metrics-tls\") pod \"dns-default-4nqbv\" (UID: \"7d470685-9573-40d7-b32c-929ed88cc56d\") " pod="openshift-dns/dns-default-4nqbv" Apr 17 16:31:54.510488 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:54.510351 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 16:31:54.510488 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:54.510352 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 16:31:54.510488 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:54.510418 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d470685-9573-40d7-b32c-929ed88cc56d-metrics-tls podName:7d470685-9573-40d7-b32c-929ed88cc56d nodeName:}" failed. No retries permitted until 2026-04-17 16:31:58.510400634 +0000 UTC m=+41.097989582 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7d470685-9573-40d7-b32c-929ed88cc56d-metrics-tls") pod "dns-default-4nqbv" (UID: "7d470685-9573-40d7-b32c-929ed88cc56d") : secret "dns-default-metrics-tls" not found Apr 17 16:31:54.510488 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:54.510436 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d60e97bd-f20c-497d-ae2a-6dac86b93c77-cert podName:d60e97bd-f20c-497d-ae2a-6dac86b93c77 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:58.510426361 +0000 UTC m=+41.098015316 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d60e97bd-f20c-497d-ae2a-6dac86b93c77-cert") pod "ingress-canary-gvlnm" (UID: "d60e97bd-f20c-497d-ae2a-6dac86b93c77") : secret "canary-serving-cert" not found Apr 17 16:31:55.132708 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:55.132677 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mnjjm" event={"ID":"759006e8-ec0f-48e4-bf68-b87d7dcbf08e","Type":"ContainerStarted","Data":"bb3fbc3b41b87e640e645770cfd6992e12ff90925d51ee6cf59aaf6a6da019e1"} Apr 17 16:31:55.154638 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:55.154593 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-mnjjm" podStartSLOduration=6.796146594 podStartE2EDuration="38.154576352s" podCreationTimestamp="2026-04-17 16:31:17 +0000 UTC" firstStartedPulling="2026-04-17 16:31:20.68691694 +0000 UTC m=+3.274505901" lastFinishedPulling="2026-04-17 16:31:52.045346709 +0000 UTC m=+34.632935659" observedRunningTime="2026-04-17 16:31:55.153617926 +0000 UTC m=+37.741206898" watchObservedRunningTime="2026-04-17 16:31:55.154576352 +0000 UTC m=+37.742165366" Apr 17 16:31:57.137276 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:57.137242 2572 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-rxw7n" event={"ID":"6644c140-1be7-4a25-becc-56acd5a0f042","Type":"ContainerStarted","Data":"c9d07e331a6e08dc23dd2217da5ebc51b7d788aed3f9ac6bee3d76bb5efc4a18"} Apr 17 16:31:57.151426 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:57.151380 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-rxw7n" podStartSLOduration=32.647133885 podStartE2EDuration="36.151366877s" podCreationTimestamp="2026-04-17 16:31:21 +0000 UTC" firstStartedPulling="2026-04-17 16:31:53.501229284 +0000 UTC m=+36.088818231" lastFinishedPulling="2026-04-17 16:31:57.005462273 +0000 UTC m=+39.593051223" observedRunningTime="2026-04-17 16:31:57.151119794 +0000 UTC m=+39.738708764" watchObservedRunningTime="2026-04-17 16:31:57.151366877 +0000 UTC m=+39.738955846" Apr 17 16:31:58.539655 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:58.539609 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7d470685-9573-40d7-b32c-929ed88cc56d-metrics-tls\") pod \"dns-default-4nqbv\" (UID: \"7d470685-9573-40d7-b32c-929ed88cc56d\") " pod="openshift-dns/dns-default-4nqbv" Apr 17 16:31:58.540006 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:31:58.539682 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d60e97bd-f20c-497d-ae2a-6dac86b93c77-cert\") pod \"ingress-canary-gvlnm\" (UID: \"d60e97bd-f20c-497d-ae2a-6dac86b93c77\") " pod="openshift-ingress-canary/ingress-canary-gvlnm" Apr 17 16:31:58.540006 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:58.539770 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 16:31:58.540006 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:58.539771 2572 secret.go:189] Couldn't get secret 
openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 16:31:58.540006 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:58.539819 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d60e97bd-f20c-497d-ae2a-6dac86b93c77-cert podName:d60e97bd-f20c-497d-ae2a-6dac86b93c77 nodeName:}" failed. No retries permitted until 2026-04-17 16:32:06.539806169 +0000 UTC m=+49.127395116 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d60e97bd-f20c-497d-ae2a-6dac86b93c77-cert") pod "ingress-canary-gvlnm" (UID: "d60e97bd-f20c-497d-ae2a-6dac86b93c77") : secret "canary-serving-cert" not found Apr 17 16:31:58.540006 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:31:58.539831 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d470685-9573-40d7-b32c-929ed88cc56d-metrics-tls podName:7d470685-9573-40d7-b32c-929ed88cc56d nodeName:}" failed. No retries permitted until 2026-04-17 16:32:06.539825195 +0000 UTC m=+49.127414143 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7d470685-9573-40d7-b32c-929ed88cc56d-metrics-tls") pod "dns-default-4nqbv" (UID: "7d470685-9573-40d7-b32c-929ed88cc56d") : secret "dns-default-metrics-tls" not found Apr 17 16:32:02.504338 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:32:02.504284 2572 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = unable to pull image or OCI artifact: pull image err: copying system image from manifest list: determining manifest MIME type for docker://quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:321285dca5a2f9b911e21badd1e51ce49841ddc45c5c859b3a29f7982d7376cb: reading manifest sha256:c530f8874aa89acf6d1834480b89067db882a7a0706e37c8fd9539a4401fdff0 in quay.io/openshift-release-dev/ocp-v4.0-art-dev: received unexpected HTTP status: 504 Gateway Time-out; artifact err: provided artifact is a container image" image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:321285dca5a2f9b911e21badd1e51ce49841ddc45c5c859b3a29f7982d7376cb" Apr 17 16:32:02.504779 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:32:02.504535 2572 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:network-check-target-container,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:321285dca5a2f9b911e21badd1e51ce49841ddc45c5c859b3a29f7982d7376cb,Command:[cluster-network-check-target],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:,HostPort:0,ContainerPort:8080,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:K8S_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{15728640 0} {} 15Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-b8m6s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/,Port:{0 8080 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:30,TimeoutSeconds:10,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000560000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-check-target-xrrkr_openshift-network-diagnostics(e90a9ccd-0623-4495-a588-60c702965b82): ErrImagePull: unable to pull image or OCI artifact: pull image err: copying system image from manifest list: determining manifest MIME type for docker://quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:321285dca5a2f9b911e21badd1e51ce49841ddc45c5c859b3a29f7982d7376cb: reading manifest sha256:c530f8874aa89acf6d1834480b89067db882a7a0706e37c8fd9539a4401fdff0 in quay.io/openshift-release-dev/ocp-v4.0-art-dev: received unexpected HTTP status: 504 Gateway Time-out; artifact err: provided artifact is a container image" logger="UnhandledError" Apr 17 16:32:02.505759 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:32:02.505709 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"network-check-target-container\" with ErrImagePull: \"unable to pull image or OCI artifact: pull image err: copying system image from manifest list: determining manifest MIME type for docker://quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:321285dca5a2f9b911e21badd1e51ce49841ddc45c5c859b3a29f7982d7376cb: reading manifest sha256:c530f8874aa89acf6d1834480b89067db882a7a0706e37c8fd9539a4401fdff0 in quay.io/openshift-release-dev/ocp-v4.0-art-dev: received unexpected HTTP status: 504 Gateway Time-out; artifact err: provided artifact is a container image\"" pod="openshift-network-diagnostics/network-check-target-xrrkr" podUID="e90a9ccd-0623-4495-a588-60c702965b82" Apr 17 16:32:03.152782 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:32:03.152742 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-check-target-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:321285dca5a2f9b911e21badd1e51ce49841ddc45c5c859b3a29f7982d7376cb\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: copying system image from manifest list: determining manifest MIME type for docker://quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:321285dca5a2f9b911e21badd1e51ce49841ddc45c5c859b3a29f7982d7376cb: reading manifest sha256:c530f8874aa89acf6d1834480b89067db882a7a0706e37c8fd9539a4401fdff0 in quay.io/openshift-release-dev/ocp-v4.0-art-dev: received unexpected HTTP status: 504 Gateway Time-out; artifact err: provided artifact is a container image\"" pod="openshift-network-diagnostics/network-check-target-xrrkr" podUID="e90a9ccd-0623-4495-a588-60c702965b82" Apr 17 16:32:06.594598 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:32:06.594561 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d60e97bd-f20c-497d-ae2a-6dac86b93c77-cert\") pod \"ingress-canary-gvlnm\" 
(UID: \"d60e97bd-f20c-497d-ae2a-6dac86b93c77\") " pod="openshift-ingress-canary/ingress-canary-gvlnm" Apr 17 16:32:06.594598 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:32:06.594597 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7d470685-9573-40d7-b32c-929ed88cc56d-metrics-tls\") pod \"dns-default-4nqbv\" (UID: \"7d470685-9573-40d7-b32c-929ed88cc56d\") " pod="openshift-dns/dns-default-4nqbv" Apr 17 16:32:06.595152 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:32:06.594687 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 16:32:06.595152 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:32:06.594697 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 16:32:06.595152 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:32:06.594766 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d60e97bd-f20c-497d-ae2a-6dac86b93c77-cert podName:d60e97bd-f20c-497d-ae2a-6dac86b93c77 nodeName:}" failed. No retries permitted until 2026-04-17 16:32:22.594751557 +0000 UTC m=+65.182340508 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d60e97bd-f20c-497d-ae2a-6dac86b93c77-cert") pod "ingress-canary-gvlnm" (UID: "d60e97bd-f20c-497d-ae2a-6dac86b93c77") : secret "canary-serving-cert" not found Apr 17 16:32:06.595152 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:32:06.594780 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d470685-9573-40d7-b32c-929ed88cc56d-metrics-tls podName:7d470685-9573-40d7-b32c-929ed88cc56d nodeName:}" failed. No retries permitted until 2026-04-17 16:32:22.594773669 +0000 UTC m=+65.182362617 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7d470685-9573-40d7-b32c-929ed88cc56d-metrics-tls") pod "dns-default-4nqbv" (UID: "7d470685-9573-40d7-b32c-929ed88cc56d") : secret "dns-default-metrics-tls" not found Apr 17 16:32:17.116520 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:32:17.116489 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vj4bz" Apr 17 16:32:19.182035 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:32:19.182000 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xrrkr" event={"ID":"e90a9ccd-0623-4495-a588-60c702965b82","Type":"ContainerStarted","Data":"11f11747bd3c13cc06b0de3125e34043888d520d505ef3653392bb5026e3cfb6"} Apr 17 16:32:19.182395 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:32:19.182197 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-xrrkr" Apr 17 16:32:19.196391 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:32:19.196301 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-xrrkr" podStartSLOduration=34.781389089 podStartE2EDuration="1m1.196288773s" podCreationTimestamp="2026-04-17 16:31:18 +0000 UTC" firstStartedPulling="2026-04-17 16:31:52.200970862 +0000 UTC m=+34.788559810" lastFinishedPulling="2026-04-17 16:32:18.615870543 +0000 UTC m=+61.203459494" observedRunningTime="2026-04-17 16:32:19.195820658 +0000 UTC m=+61.783409619" watchObservedRunningTime="2026-04-17 16:32:19.196288773 +0000 UTC m=+61.783877736" Apr 17 16:32:22.603253 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:32:22.603204 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d60e97bd-f20c-497d-ae2a-6dac86b93c77-cert\") pod \"ingress-canary-gvlnm\" (UID: 
\"d60e97bd-f20c-497d-ae2a-6dac86b93c77\") " pod="openshift-ingress-canary/ingress-canary-gvlnm" Apr 17 16:32:22.603253 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:32:22.603254 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7d470685-9573-40d7-b32c-929ed88cc56d-metrics-tls\") pod \"dns-default-4nqbv\" (UID: \"7d470685-9573-40d7-b32c-929ed88cc56d\") " pod="openshift-dns/dns-default-4nqbv" Apr 17 16:32:22.603674 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:32:22.603339 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 16:32:22.603674 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:32:22.603340 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 16:32:22.603674 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:32:22.603391 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d470685-9573-40d7-b32c-929ed88cc56d-metrics-tls podName:7d470685-9573-40d7-b32c-929ed88cc56d nodeName:}" failed. No retries permitted until 2026-04-17 16:32:54.603378496 +0000 UTC m=+97.190967445 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7d470685-9573-40d7-b32c-929ed88cc56d-metrics-tls") pod "dns-default-4nqbv" (UID: "7d470685-9573-40d7-b32c-929ed88cc56d") : secret "dns-default-metrics-tls" not found Apr 17 16:32:22.603674 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:32:22.603406 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d60e97bd-f20c-497d-ae2a-6dac86b93c77-cert podName:d60e97bd-f20c-497d-ae2a-6dac86b93c77 nodeName:}" failed. No retries permitted until 2026-04-17 16:32:54.603398529 +0000 UTC m=+97.190987477 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d60e97bd-f20c-497d-ae2a-6dac86b93c77-cert") pod "ingress-canary-gvlnm" (UID: "d60e97bd-f20c-497d-ae2a-6dac86b93c77") : secret "canary-serving-cert" not found Apr 17 16:32:23.709484 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:32:23.709443 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/36f3412d-e266-4f24-8ea6-1f3d3cdd2546-metrics-certs\") pod \"network-metrics-daemon-vw79z\" (UID: \"36f3412d-e266-4f24-8ea6-1f3d3cdd2546\") " pod="openshift-multus/network-metrics-daemon-vw79z" Apr 17 16:32:23.709893 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:32:23.709561 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 17 16:32:23.709893 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:32:23.709623 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/36f3412d-e266-4f24-8ea6-1f3d3cdd2546-metrics-certs podName:36f3412d-e266-4f24-8ea6-1f3d3cdd2546 nodeName:}" failed. No retries permitted until 2026-04-17 16:33:27.709609702 +0000 UTC m=+130.297198664 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/36f3412d-e266-4f24-8ea6-1f3d3cdd2546-metrics-certs") pod "network-metrics-daemon-vw79z" (UID: "36f3412d-e266-4f24-8ea6-1f3d3cdd2546") : secret "metrics-daemon-secret" not found Apr 17 16:32:50.185953 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:32:50.185827 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xrrkr" Apr 17 16:32:54.619997 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:32:54.619965 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d60e97bd-f20c-497d-ae2a-6dac86b93c77-cert\") pod \"ingress-canary-gvlnm\" (UID: \"d60e97bd-f20c-497d-ae2a-6dac86b93c77\") " pod="openshift-ingress-canary/ingress-canary-gvlnm" Apr 17 16:32:54.619997 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:32:54.619999 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7d470685-9573-40d7-b32c-929ed88cc56d-metrics-tls\") pod \"dns-default-4nqbv\" (UID: \"7d470685-9573-40d7-b32c-929ed88cc56d\") " pod="openshift-dns/dns-default-4nqbv" Apr 17 16:32:54.620400 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:32:54.620097 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 16:32:54.620400 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:32:54.620106 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 16:32:54.620400 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:32:54.620157 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d470685-9573-40d7-b32c-929ed88cc56d-metrics-tls podName:7d470685-9573-40d7-b32c-929ed88cc56d nodeName:}" failed. 
No retries permitted until 2026-04-17 16:33:58.62014423 +0000 UTC m=+161.207733178 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7d470685-9573-40d7-b32c-929ed88cc56d-metrics-tls") pod "dns-default-4nqbv" (UID: "7d470685-9573-40d7-b32c-929ed88cc56d") : secret "dns-default-metrics-tls" not found Apr 17 16:32:54.620400 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:32:54.620169 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d60e97bd-f20c-497d-ae2a-6dac86b93c77-cert podName:d60e97bd-f20c-497d-ae2a-6dac86b93c77 nodeName:}" failed. No retries permitted until 2026-04-17 16:33:58.62016359 +0000 UTC m=+161.207752538 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d60e97bd-f20c-497d-ae2a-6dac86b93c77-cert") pod "ingress-canary-gvlnm" (UID: "d60e97bd-f20c-497d-ae2a-6dac86b93c77") : secret "canary-serving-cert" not found Apr 17 16:33:00.444358 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:00.444327 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-v7vr6"] Apr 17 16:33:00.448813 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:00.448794 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-fpcqf"] Apr 17 16:33:00.448956 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:00.448939 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-v7vr6" Apr 17 16:33:00.452676 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:00.452647 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 17 16:33:00.452971 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:00.452953 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-qghlf"] Apr 17 16:33:00.453574 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:00.453554 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-fpcqf" Apr 17 16:33:00.455231 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:00.455212 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 17 16:33:00.455324 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:00.455235 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 17 16:33:00.455324 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:00.455272 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 17 16:33:00.455433 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:00.455368 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-b7ztz\"" Apr 17 16:33:00.455784 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:00.455771 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 17 16:33:00.456082 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:00.456067 2572 reflector.go:430] "Caches populated" 
type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 17 16:33:00.456341 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:00.456321 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-l6mkp"] Apr 17 16:33:00.456485 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:00.456470 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-qghlf" Apr 17 16:33:00.458582 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:00.458562 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 17 16:33:00.458675 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:00.458567 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-l7vd2\"" Apr 17 16:33:00.458932 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:00.458913 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-v7vr6"] Apr 17 16:33:00.459017 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:00.458960 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 17 16:33:00.459017 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:00.458992 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 17 16:33:00.459017 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:00.459010 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-l6mkp" Apr 17 16:33:00.459777 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:00.459516 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 17 16:33:00.459777 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:00.459582 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-l99k2\"" Apr 17 16:33:00.461326 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:00.461308 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-fpcqf"] Apr 17 16:33:00.461450 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:00.461434 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 17 16:33:00.461610 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:00.461593 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 17 16:33:00.461686 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:00.461631 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 17 16:33:00.461686 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:00.461668 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 17 16:33:00.461997 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:00.461982 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-vtfzj\"" Apr 17 16:33:00.467276 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:00.467254 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 17 16:33:00.467370 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:00.467334 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-l6mkp"] Apr 17 16:33:00.469932 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:00.469913 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 17 16:33:00.478279 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:00.478097 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-qghlf"] Apr 17 16:33:00.555700 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:00.555669 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-cm9d7"] Apr 17 16:33:00.558570 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:00.558539 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-cm9d7" Apr 17 16:33:00.561041 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:00.561021 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cb478791-e3d5-4b73-803c-4c43377c9ebc-serving-cert\") pod \"insights-operator-585dfdc468-l6mkp\" (UID: \"cb478791-e3d5-4b73-803c-4c43377c9ebc\") " pod="openshift-insights/insights-operator-585dfdc468-l6mkp" Apr 17 16:33:00.561153 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:00.561046 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzfx7\" (UniqueName: \"kubernetes.io/projected/cb478791-e3d5-4b73-803c-4c43377c9ebc-kube-api-access-xzfx7\") pod \"insights-operator-585dfdc468-l6mkp\" (UID: \"cb478791-e3d5-4b73-803c-4c43377c9ebc\") " pod="openshift-insights/insights-operator-585dfdc468-l6mkp" Apr 17 16:33:00.561153 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:00.561064 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4135c5b6-7f8a-4eaf-b551-405c8ab00981-trusted-ca\") pod \"console-operator-9d4b6777b-qghlf\" (UID: \"4135c5b6-7f8a-4eaf-b551-405c8ab00981\") " pod="openshift-console-operator/console-operator-9d4b6777b-qghlf" Apr 17 16:33:00.561153 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:00.561083 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a280da6f-8899-47aa-ac6c-6b5ddcada842-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-v7vr6\" (UID: \"a280da6f-8899-47aa-ac6c-6b5ddcada842\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-v7vr6" Apr 17 16:33:00.561153 ip-10-0-131-177 
kubenswrapper[2572]: I0417 16:33:00.561128 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4c96l\" (UniqueName: \"kubernetes.io/projected/4135c5b6-7f8a-4eaf-b551-405c8ab00981-kube-api-access-4c96l\") pod \"console-operator-9d4b6777b-qghlf\" (UID: \"4135c5b6-7f8a-4eaf-b551-405c8ab00981\") " pod="openshift-console-operator/console-operator-9d4b6777b-qghlf" Apr 17 16:33:00.561153 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:00.561154 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cb478791-e3d5-4b73-803c-4c43377c9ebc-service-ca-bundle\") pod \"insights-operator-585dfdc468-l6mkp\" (UID: \"cb478791-e3d5-4b73-803c-4c43377c9ebc\") " pod="openshift-insights/insights-operator-585dfdc468-l6mkp" Apr 17 16:33:00.561354 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:00.561189 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/a280da6f-8899-47aa-ac6c-6b5ddcada842-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-v7vr6\" (UID: \"a280da6f-8899-47aa-ac6c-6b5ddcada842\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-v7vr6" Apr 17 16:33:00.561354 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:00.561209 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4135c5b6-7f8a-4eaf-b551-405c8ab00981-config\") pod \"console-operator-9d4b6777b-qghlf\" (UID: \"4135c5b6-7f8a-4eaf-b551-405c8ab00981\") " pod="openshift-console-operator/console-operator-9d4b6777b-qghlf" Apr 17 16:33:00.561354 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:00.561222 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5s8cd\" 
(UniqueName: \"kubernetes.io/projected/a280da6f-8899-47aa-ac6c-6b5ddcada842-kube-api-access-5s8cd\") pod \"cluster-monitoring-operator-75587bd455-v7vr6\" (UID: \"a280da6f-8899-47aa-ac6c-6b5ddcada842\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-v7vr6" Apr 17 16:33:00.561354 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:00.561242 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/cb478791-e3d5-4b73-803c-4c43377c9ebc-tmp\") pod \"insights-operator-585dfdc468-l6mkp\" (UID: \"cb478791-e3d5-4b73-803c-4c43377c9ebc\") " pod="openshift-insights/insights-operator-585dfdc468-l6mkp" Apr 17 16:33:00.561354 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:00.561265 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cb478791-e3d5-4b73-803c-4c43377c9ebc-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-l6mkp\" (UID: \"cb478791-e3d5-4b73-803c-4c43377c9ebc\") " pod="openshift-insights/insights-operator-585dfdc468-l6mkp" Apr 17 16:33:00.561354 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:00.561319 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/cb478791-e3d5-4b73-803c-4c43377c9ebc-snapshots\") pod \"insights-operator-585dfdc468-l6mkp\" (UID: \"cb478791-e3d5-4b73-803c-4c43377c9ebc\") " pod="openshift-insights/insights-operator-585dfdc468-l6mkp" Apr 17 16:33:00.561354 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:00.561356 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4135c5b6-7f8a-4eaf-b551-405c8ab00981-serving-cert\") pod \"console-operator-9d4b6777b-qghlf\" (UID: \"4135c5b6-7f8a-4eaf-b551-405c8ab00981\") " 
pod="openshift-console-operator/console-operator-9d4b6777b-qghlf" Apr 17 16:33:00.561578 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:00.561395 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlxj6\" (UniqueName: \"kubernetes.io/projected/7ad42cb5-577d-4b4b-b374-97b0a9270a0e-kube-api-access-zlxj6\") pod \"volume-data-source-validator-7c6cbb6c87-fpcqf\" (UID: \"7ad42cb5-577d-4b4b-b374-97b0a9270a0e\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-fpcqf" Apr 17 16:33:00.562959 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:00.562942 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 17 16:33:00.563044 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:00.562968 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 17 16:33:00.563561 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:00.563541 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 17 16:33:00.564392 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:00.564370 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 17 16:33:00.565025 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:00.564996 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-wqs65\"" Apr 17 16:33:00.577357 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:00.577337 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-cm9d7"] Apr 17 16:33:00.662438 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:00.662406 2572 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cb478791-e3d5-4b73-803c-4c43377c9ebc-service-ca-bundle\") pod \"insights-operator-585dfdc468-l6mkp\" (UID: \"cb478791-e3d5-4b73-803c-4c43377c9ebc\") " pod="openshift-insights/insights-operator-585dfdc468-l6mkp" Apr 17 16:33:00.662587 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:00.662459 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/a280da6f-8899-47aa-ac6c-6b5ddcada842-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-v7vr6\" (UID: \"a280da6f-8899-47aa-ac6c-6b5ddcada842\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-v7vr6" Apr 17 16:33:00.662587 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:00.662489 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4135c5b6-7f8a-4eaf-b551-405c8ab00981-config\") pod \"console-operator-9d4b6777b-qghlf\" (UID: \"4135c5b6-7f8a-4eaf-b551-405c8ab00981\") " pod="openshift-console-operator/console-operator-9d4b6777b-qghlf" Apr 17 16:33:00.662587 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:00.662512 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5s8cd\" (UniqueName: \"kubernetes.io/projected/a280da6f-8899-47aa-ac6c-6b5ddcada842-kube-api-access-5s8cd\") pod \"cluster-monitoring-operator-75587bd455-v7vr6\" (UID: \"a280da6f-8899-47aa-ac6c-6b5ddcada842\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-v7vr6" Apr 17 16:33:00.662587 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:00.662542 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f71c20f2-eb49-4550-bc66-4d1973c1fdc8-serving-cert\") pod 
\"service-ca-operator-d6fc45fc5-cm9d7\" (UID: \"f71c20f2-eb49-4550-bc66-4d1973c1fdc8\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-cm9d7" Apr 17 16:33:00.662587 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:00.662581 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/cb478791-e3d5-4b73-803c-4c43377c9ebc-tmp\") pod \"insights-operator-585dfdc468-l6mkp\" (UID: \"cb478791-e3d5-4b73-803c-4c43377c9ebc\") " pod="openshift-insights/insights-operator-585dfdc468-l6mkp" Apr 17 16:33:00.662867 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:00.662605 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cb478791-e3d5-4b73-803c-4c43377c9ebc-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-l6mkp\" (UID: \"cb478791-e3d5-4b73-803c-4c43377c9ebc\") " pod="openshift-insights/insights-operator-585dfdc468-l6mkp" Apr 17 16:33:00.662867 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:00.662633 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/cb478791-e3d5-4b73-803c-4c43377c9ebc-snapshots\") pod \"insights-operator-585dfdc468-l6mkp\" (UID: \"cb478791-e3d5-4b73-803c-4c43377c9ebc\") " pod="openshift-insights/insights-operator-585dfdc468-l6mkp" Apr 17 16:33:00.662867 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:00.662657 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4135c5b6-7f8a-4eaf-b551-405c8ab00981-serving-cert\") pod \"console-operator-9d4b6777b-qghlf\" (UID: \"4135c5b6-7f8a-4eaf-b551-405c8ab00981\") " pod="openshift-console-operator/console-operator-9d4b6777b-qghlf" Apr 17 16:33:00.662867 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:00.662688 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsz56\" (UniqueName: \"kubernetes.io/projected/f71c20f2-eb49-4550-bc66-4d1973c1fdc8-kube-api-access-jsz56\") pod \"service-ca-operator-d6fc45fc5-cm9d7\" (UID: \"f71c20f2-eb49-4550-bc66-4d1973c1fdc8\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-cm9d7" Apr 17 16:33:00.662867 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:00.662757 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zlxj6\" (UniqueName: \"kubernetes.io/projected/7ad42cb5-577d-4b4b-b374-97b0a9270a0e-kube-api-access-zlxj6\") pod \"volume-data-source-validator-7c6cbb6c87-fpcqf\" (UID: \"7ad42cb5-577d-4b4b-b374-97b0a9270a0e\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-fpcqf" Apr 17 16:33:00.662867 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:00.662801 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cb478791-e3d5-4b73-803c-4c43377c9ebc-serving-cert\") pod \"insights-operator-585dfdc468-l6mkp\" (UID: \"cb478791-e3d5-4b73-803c-4c43377c9ebc\") " pod="openshift-insights/insights-operator-585dfdc468-l6mkp" Apr 17 16:33:00.662867 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:00.662826 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xzfx7\" (UniqueName: \"kubernetes.io/projected/cb478791-e3d5-4b73-803c-4c43377c9ebc-kube-api-access-xzfx7\") pod \"insights-operator-585dfdc468-l6mkp\" (UID: \"cb478791-e3d5-4b73-803c-4c43377c9ebc\") " pod="openshift-insights/insights-operator-585dfdc468-l6mkp" Apr 17 16:33:00.662867 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:00.662853 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4135c5b6-7f8a-4eaf-b551-405c8ab00981-trusted-ca\") pod 
\"console-operator-9d4b6777b-qghlf\" (UID: \"4135c5b6-7f8a-4eaf-b551-405c8ab00981\") " pod="openshift-console-operator/console-operator-9d4b6777b-qghlf" Apr 17 16:33:00.663231 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:00.662879 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f71c20f2-eb49-4550-bc66-4d1973c1fdc8-config\") pod \"service-ca-operator-d6fc45fc5-cm9d7\" (UID: \"f71c20f2-eb49-4550-bc66-4d1973c1fdc8\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-cm9d7" Apr 17 16:33:00.663231 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:00.662944 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/cb478791-e3d5-4b73-803c-4c43377c9ebc-tmp\") pod \"insights-operator-585dfdc468-l6mkp\" (UID: \"cb478791-e3d5-4b73-803c-4c43377c9ebc\") " pod="openshift-insights/insights-operator-585dfdc468-l6mkp" Apr 17 16:33:00.663231 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:00.663103 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cb478791-e3d5-4b73-803c-4c43377c9ebc-service-ca-bundle\") pod \"insights-operator-585dfdc468-l6mkp\" (UID: \"cb478791-e3d5-4b73-803c-4c43377c9ebc\") " pod="openshift-insights/insights-operator-585dfdc468-l6mkp" Apr 17 16:33:00.663382 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:00.663265 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/cb478791-e3d5-4b73-803c-4c43377c9ebc-snapshots\") pod \"insights-operator-585dfdc468-l6mkp\" (UID: \"cb478791-e3d5-4b73-803c-4c43377c9ebc\") " pod="openshift-insights/insights-operator-585dfdc468-l6mkp" Apr 17 16:33:00.663382 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:00.663270 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/4135c5b6-7f8a-4eaf-b551-405c8ab00981-config\") pod \"console-operator-9d4b6777b-qghlf\" (UID: \"4135c5b6-7f8a-4eaf-b551-405c8ab00981\") " pod="openshift-console-operator/console-operator-9d4b6777b-qghlf" Apr 17 16:33:00.663382 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:00.663300 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/a280da6f-8899-47aa-ac6c-6b5ddcada842-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-v7vr6\" (UID: \"a280da6f-8899-47aa-ac6c-6b5ddcada842\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-v7vr6" Apr 17 16:33:00.663382 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:00.663325 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a280da6f-8899-47aa-ac6c-6b5ddcada842-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-v7vr6\" (UID: \"a280da6f-8899-47aa-ac6c-6b5ddcada842\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-v7vr6" Apr 17 16:33:00.663382 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:00.663366 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4c96l\" (UniqueName: \"kubernetes.io/projected/4135c5b6-7f8a-4eaf-b551-405c8ab00981-kube-api-access-4c96l\") pod \"console-operator-9d4b6777b-qghlf\" (UID: \"4135c5b6-7f8a-4eaf-b551-405c8ab00981\") " pod="openshift-console-operator/console-operator-9d4b6777b-qghlf" Apr 17 16:33:00.663600 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:33:00.663456 2572 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 16:33:00.663600 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:33:00.663529 2572 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/a280da6f-8899-47aa-ac6c-6b5ddcada842-cluster-monitoring-operator-tls podName:a280da6f-8899-47aa-ac6c-6b5ddcada842 nodeName:}" failed. No retries permitted until 2026-04-17 16:33:01.163510182 +0000 UTC m=+103.751099135 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/a280da6f-8899-47aa-ac6c-6b5ddcada842-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-v7vr6" (UID: "a280da6f-8899-47aa-ac6c-6b5ddcada842") : secret "cluster-monitoring-operator-tls" not found Apr 17 16:33:00.663892 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:00.663868 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cb478791-e3d5-4b73-803c-4c43377c9ebc-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-l6mkp\" (UID: \"cb478791-e3d5-4b73-803c-4c43377c9ebc\") " pod="openshift-insights/insights-operator-585dfdc468-l6mkp" Apr 17 16:33:00.663986 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:00.663895 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4135c5b6-7f8a-4eaf-b551-405c8ab00981-trusted-ca\") pod \"console-operator-9d4b6777b-qghlf\" (UID: \"4135c5b6-7f8a-4eaf-b551-405c8ab00981\") " pod="openshift-console-operator/console-operator-9d4b6777b-qghlf" Apr 17 16:33:00.666081 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:00.666059 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cb478791-e3d5-4b73-803c-4c43377c9ebc-serving-cert\") pod \"insights-operator-585dfdc468-l6mkp\" (UID: \"cb478791-e3d5-4b73-803c-4c43377c9ebc\") " pod="openshift-insights/insights-operator-585dfdc468-l6mkp" Apr 17 16:33:00.666170 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:00.666151 2572 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4135c5b6-7f8a-4eaf-b551-405c8ab00981-serving-cert\") pod \"console-operator-9d4b6777b-qghlf\" (UID: \"4135c5b6-7f8a-4eaf-b551-405c8ab00981\") " pod="openshift-console-operator/console-operator-9d4b6777b-qghlf" Apr 17 16:33:00.671650 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:00.671626 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5s8cd\" (UniqueName: \"kubernetes.io/projected/a280da6f-8899-47aa-ac6c-6b5ddcada842-kube-api-access-5s8cd\") pod \"cluster-monitoring-operator-75587bd455-v7vr6\" (UID: \"a280da6f-8899-47aa-ac6c-6b5ddcada842\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-v7vr6" Apr 17 16:33:00.673503 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:00.673483 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlxj6\" (UniqueName: \"kubernetes.io/projected/7ad42cb5-577d-4b4b-b374-97b0a9270a0e-kube-api-access-zlxj6\") pod \"volume-data-source-validator-7c6cbb6c87-fpcqf\" (UID: \"7ad42cb5-577d-4b4b-b374-97b0a9270a0e\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-fpcqf" Apr 17 16:33:00.674028 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:00.674004 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4c96l\" (UniqueName: \"kubernetes.io/projected/4135c5b6-7f8a-4eaf-b551-405c8ab00981-kube-api-access-4c96l\") pod \"console-operator-9d4b6777b-qghlf\" (UID: \"4135c5b6-7f8a-4eaf-b551-405c8ab00981\") " pod="openshift-console-operator/console-operator-9d4b6777b-qghlf" Apr 17 16:33:00.674199 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:00.674140 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzfx7\" (UniqueName: \"kubernetes.io/projected/cb478791-e3d5-4b73-803c-4c43377c9ebc-kube-api-access-xzfx7\") pod \"insights-operator-585dfdc468-l6mkp\" 
(UID: \"cb478791-e3d5-4b73-803c-4c43377c9ebc\") " pod="openshift-insights/insights-operator-585dfdc468-l6mkp" Apr 17 16:33:00.764328 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:00.764236 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f71c20f2-eb49-4550-bc66-4d1973c1fdc8-serving-cert\") pod \"service-ca-operator-d6fc45fc5-cm9d7\" (UID: \"f71c20f2-eb49-4550-bc66-4d1973c1fdc8\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-cm9d7" Apr 17 16:33:00.764328 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:00.764280 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jsz56\" (UniqueName: \"kubernetes.io/projected/f71c20f2-eb49-4550-bc66-4d1973c1fdc8-kube-api-access-jsz56\") pod \"service-ca-operator-d6fc45fc5-cm9d7\" (UID: \"f71c20f2-eb49-4550-bc66-4d1973c1fdc8\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-cm9d7" Apr 17 16:33:00.764556 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:00.764540 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f71c20f2-eb49-4550-bc66-4d1973c1fdc8-config\") pod \"service-ca-operator-d6fc45fc5-cm9d7\" (UID: \"f71c20f2-eb49-4550-bc66-4d1973c1fdc8\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-cm9d7" Apr 17 16:33:00.765044 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:00.765025 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f71c20f2-eb49-4550-bc66-4d1973c1fdc8-config\") pod \"service-ca-operator-d6fc45fc5-cm9d7\" (UID: \"f71c20f2-eb49-4550-bc66-4d1973c1fdc8\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-cm9d7" Apr 17 16:33:00.766589 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:00.766569 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/f71c20f2-eb49-4550-bc66-4d1973c1fdc8-serving-cert\") pod \"service-ca-operator-d6fc45fc5-cm9d7\" (UID: \"f71c20f2-eb49-4550-bc66-4d1973c1fdc8\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-cm9d7" Apr 17 16:33:00.769595 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:00.769572 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-fpcqf" Apr 17 16:33:00.772601 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:00.772581 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsz56\" (UniqueName: \"kubernetes.io/projected/f71c20f2-eb49-4550-bc66-4d1973c1fdc8-kube-api-access-jsz56\") pod \"service-ca-operator-d6fc45fc5-cm9d7\" (UID: \"f71c20f2-eb49-4550-bc66-4d1973c1fdc8\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-cm9d7" Apr 17 16:33:00.774545 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:00.774527 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-qghlf" Apr 17 16:33:00.781188 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:00.781171 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-l6mkp" Apr 17 16:33:00.867682 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:00.867343 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-cm9d7" Apr 17 16:33:00.905923 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:00.905889 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-fpcqf"] Apr 17 16:33:00.909574 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:33:00.909544 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ad42cb5_577d_4b4b_b374_97b0a9270a0e.slice/crio-6bfd8f595c62d3c7b8f5bdfd597f85172c0d1e83eeac3c25e08d430b19bca3da WatchSource:0}: Error finding container 6bfd8f595c62d3c7b8f5bdfd597f85172c0d1e83eeac3c25e08d430b19bca3da: Status 404 returned error can't find the container with id 6bfd8f595c62d3c7b8f5bdfd597f85172c0d1e83eeac3c25e08d430b19bca3da Apr 17 16:33:00.989658 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:00.989627 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-cm9d7"] Apr 17 16:33:00.992411 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:33:00.992384 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf71c20f2_eb49_4550_bc66_4d1973c1fdc8.slice/crio-8da0aac9463063db9f18e3f874f6e0b8aad7ca651b8a1d9f50d9049169cdf3fe WatchSource:0}: Error finding container 8da0aac9463063db9f18e3f874f6e0b8aad7ca651b8a1d9f50d9049169cdf3fe: Status 404 returned error can't find the container with id 8da0aac9463063db9f18e3f874f6e0b8aad7ca651b8a1d9f50d9049169cdf3fe Apr 17 16:33:01.123264 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:01.123236 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-qghlf"] Apr 17 16:33:01.125945 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:01.125856 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-insights/insights-operator-585dfdc468-l6mkp"] Apr 17 16:33:01.127891 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:33:01.127861 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4135c5b6_7f8a_4eaf_b551_405c8ab00981.slice/crio-1c6e84ebfd89f37322112dae1de923b8e6af2ea04c7d2f01004fd937784944f2 WatchSource:0}: Error finding container 1c6e84ebfd89f37322112dae1de923b8e6af2ea04c7d2f01004fd937784944f2: Status 404 returned error can't find the container with id 1c6e84ebfd89f37322112dae1de923b8e6af2ea04c7d2f01004fd937784944f2 Apr 17 16:33:01.130384 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:33:01.129995 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb478791_e3d5_4b73_803c_4c43377c9ebc.slice/crio-87656c0d9651986f257524eaed8bdc1e62adb34adf9c1649f4a8204057575a49 WatchSource:0}: Error finding container 87656c0d9651986f257524eaed8bdc1e62adb34adf9c1649f4a8204057575a49: Status 404 returned error can't find the container with id 87656c0d9651986f257524eaed8bdc1e62adb34adf9c1649f4a8204057575a49 Apr 17 16:33:01.167829 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:01.167807 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a280da6f-8899-47aa-ac6c-6b5ddcada842-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-v7vr6\" (UID: \"a280da6f-8899-47aa-ac6c-6b5ddcada842\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-v7vr6" Apr 17 16:33:01.167928 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:33:01.167917 2572 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 16:33:01.167977 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:33:01.167971 2572 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a280da6f-8899-47aa-ac6c-6b5ddcada842-cluster-monitoring-operator-tls podName:a280da6f-8899-47aa-ac6c-6b5ddcada842 nodeName:}" failed. No retries permitted until 2026-04-17 16:33:02.16795781 +0000 UTC m=+104.755546757 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/a280da6f-8899-47aa-ac6c-6b5ddcada842-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-v7vr6" (UID: "a280da6f-8899-47aa-ac6c-6b5ddcada842") : secret "cluster-monitoring-operator-tls" not found Apr 17 16:33:01.261225 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:01.261188 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-cm9d7" event={"ID":"f71c20f2-eb49-4550-bc66-4d1973c1fdc8","Type":"ContainerStarted","Data":"8da0aac9463063db9f18e3f874f6e0b8aad7ca651b8a1d9f50d9049169cdf3fe"} Apr 17 16:33:01.262239 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:01.262208 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-fpcqf" event={"ID":"7ad42cb5-577d-4b4b-b374-97b0a9270a0e","Type":"ContainerStarted","Data":"6bfd8f595c62d3c7b8f5bdfd597f85172c0d1e83eeac3c25e08d430b19bca3da"} Apr 17 16:33:01.263175 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:01.263149 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-l6mkp" event={"ID":"cb478791-e3d5-4b73-803c-4c43377c9ebc","Type":"ContainerStarted","Data":"87656c0d9651986f257524eaed8bdc1e62adb34adf9c1649f4a8204057575a49"} Apr 17 16:33:01.264051 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:01.264030 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-qghlf" 
event={"ID":"4135c5b6-7f8a-4eaf-b551-405c8ab00981","Type":"ContainerStarted","Data":"1c6e84ebfd89f37322112dae1de923b8e6af2ea04c7d2f01004fd937784944f2"} Apr 17 16:33:02.174593 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:02.174462 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a280da6f-8899-47aa-ac6c-6b5ddcada842-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-v7vr6\" (UID: \"a280da6f-8899-47aa-ac6c-6b5ddcada842\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-v7vr6" Apr 17 16:33:02.175076 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:33:02.174623 2572 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 16:33:02.175076 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:33:02.174708 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a280da6f-8899-47aa-ac6c-6b5ddcada842-cluster-monitoring-operator-tls podName:a280da6f-8899-47aa-ac6c-6b5ddcada842 nodeName:}" failed. No retries permitted until 2026-04-17 16:33:04.174686882 +0000 UTC m=+106.762275846 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/a280da6f-8899-47aa-ac6c-6b5ddcada842-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-v7vr6" (UID: "a280da6f-8899-47aa-ac6c-6b5ddcada842") : secret "cluster-monitoring-operator-tls" not found Apr 17 16:33:03.270158 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:03.270122 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-fpcqf" event={"ID":"7ad42cb5-577d-4b4b-b374-97b0a9270a0e","Type":"ContainerStarted","Data":"cdd554b7740d5e0f1367aec0dc33152ff48a40ab7a3504e8320bdce0b29eb617"} Apr 17 16:33:03.285821 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:03.285762 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-fpcqf" podStartSLOduration=1.7403214519999999 podStartE2EDuration="3.285747794s" podCreationTimestamp="2026-04-17 16:33:00 +0000 UTC" firstStartedPulling="2026-04-17 16:33:00.911258448 +0000 UTC m=+103.498847396" lastFinishedPulling="2026-04-17 16:33:02.456684775 +0000 UTC m=+105.044273738" observedRunningTime="2026-04-17 16:33:03.284582359 +0000 UTC m=+105.872171331" watchObservedRunningTime="2026-04-17 16:33:03.285747794 +0000 UTC m=+105.873336763" Apr 17 16:33:04.195585 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:04.195558 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a280da6f-8899-47aa-ac6c-6b5ddcada842-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-v7vr6\" (UID: \"a280da6f-8899-47aa-ac6c-6b5ddcada842\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-v7vr6" Apr 17 16:33:04.195741 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:33:04.195699 2572 secret.go:189] Couldn't get 
secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 16:33:04.195830 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:33:04.195818 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a280da6f-8899-47aa-ac6c-6b5ddcada842-cluster-monitoring-operator-tls podName:a280da6f-8899-47aa-ac6c-6b5ddcada842 nodeName:}" failed. No retries permitted until 2026-04-17 16:33:08.195796156 +0000 UTC m=+110.783385122 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/a280da6f-8899-47aa-ac6c-6b5ddcada842-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-v7vr6" (UID: "a280da6f-8899-47aa-ac6c-6b5ddcada842") : secret "cluster-monitoring-operator-tls" not found Apr 17 16:33:04.275617 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:04.275564 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-l6mkp" event={"ID":"cb478791-e3d5-4b73-803c-4c43377c9ebc","Type":"ContainerStarted","Data":"48ab67fcc7161ac7bd8c0ceb076dc351145abfa962dc0c1f814da1caf4104eb3"} Apr 17 16:33:04.277865 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:04.277680 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-qghlf" event={"ID":"4135c5b6-7f8a-4eaf-b551-405c8ab00981","Type":"ContainerStarted","Data":"f1b9ba59dd4fe53551f691d3401b9ba1ac9250d88b0f9558b1e687ea22fd1614"} Apr 17 16:33:04.277865 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:04.277841 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-qghlf" Apr 17 16:33:04.279144 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:04.279121 2572 patch_prober.go:28] interesting pod/console-operator-9d4b6777b-qghlf container/console-operator 
namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.133.0.10:8443/readyz\": dial tcp 10.133.0.10:8443: connect: connection refused" start-of-body= Apr 17 16:33:04.279225 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:04.279161 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-9d4b6777b-qghlf" podUID="4135c5b6-7f8a-4eaf-b551-405c8ab00981" containerName="console-operator" probeResult="failure" output="Get \"https://10.133.0.10:8443/readyz\": dial tcp 10.133.0.10:8443: connect: connection refused" Apr 17 16:33:04.279916 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:04.279893 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-cm9d7" event={"ID":"f71c20f2-eb49-4550-bc66-4d1973c1fdc8","Type":"ContainerStarted","Data":"727c08970f719065c0f9c93acd41a9e61c1ee6f4a2fa4da679fac62ff400a14e"} Apr 17 16:33:04.309922 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:04.309875 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-l6mkp" podStartSLOduration=1.287664237 podStartE2EDuration="4.309860944s" podCreationTimestamp="2026-04-17 16:33:00 +0000 UTC" firstStartedPulling="2026-04-17 16:33:01.132018631 +0000 UTC m=+103.719607582" lastFinishedPulling="2026-04-17 16:33:04.154215328 +0000 UTC m=+106.741804289" observedRunningTime="2026-04-17 16:33:04.290972924 +0000 UTC m=+106.878561896" watchObservedRunningTime="2026-04-17 16:33:04.309860944 +0000 UTC m=+106.897449913" Apr 17 16:33:04.310375 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:04.310344 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-cm9d7" podStartSLOduration=1.15349883 podStartE2EDuration="4.310337179s" podCreationTimestamp="2026-04-17 16:33:00 +0000 UTC" firstStartedPulling="2026-04-17 
16:33:00.994188898 +0000 UTC m=+103.581777846" lastFinishedPulling="2026-04-17 16:33:04.151027246 +0000 UTC m=+106.738616195" observedRunningTime="2026-04-17 16:33:04.309693531 +0000 UTC m=+106.897282508" watchObservedRunningTime="2026-04-17 16:33:04.310337179 +0000 UTC m=+106.897926144" Apr 17 16:33:04.324873 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:04.324752 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-qghlf" podStartSLOduration=1.299567581 podStartE2EDuration="4.324707794s" podCreationTimestamp="2026-04-17 16:33:00 +0000 UTC" firstStartedPulling="2026-04-17 16:33:01.13139935 +0000 UTC m=+103.718988299" lastFinishedPulling="2026-04-17 16:33:04.156539565 +0000 UTC m=+106.744128512" observedRunningTime="2026-04-17 16:33:04.324648761 +0000 UTC m=+106.912237732" watchObservedRunningTime="2026-04-17 16:33:04.324707794 +0000 UTC m=+106.912296766" Apr 17 16:33:05.117790 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:05.117754 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-lbmtc"] Apr 17 16:33:05.120675 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:05.120660 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-lbmtc" Apr 17 16:33:05.123072 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:05.123044 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 17 16:33:05.123072 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:05.123065 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 17 16:33:05.123890 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:05.123871 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-x2ltf\"" Apr 17 16:33:05.138240 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:05.138214 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-lbmtc"] Apr 17 16:33:05.204736 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:05.204696 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wjpz\" (UniqueName: \"kubernetes.io/projected/79cd9bd5-6c64-4090-a94b-e2339628ff70-kube-api-access-7wjpz\") pod \"migrator-74bb7799d9-lbmtc\" (UID: \"79cd9bd5-6c64-4090-a94b-e2339628ff70\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-lbmtc" Apr 17 16:33:05.284113 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:05.284087 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-qghlf_4135c5b6-7f8a-4eaf-b551-405c8ab00981/console-operator/0.log" Apr 17 16:33:05.284601 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:05.284578 2572 generic.go:358] "Generic (PLEG): container finished" podID="4135c5b6-7f8a-4eaf-b551-405c8ab00981" 
containerID="f1b9ba59dd4fe53551f691d3401b9ba1ac9250d88b0f9558b1e687ea22fd1614" exitCode=255 Apr 17 16:33:05.285499 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:05.285475 2572 scope.go:117] "RemoveContainer" containerID="f1b9ba59dd4fe53551f691d3401b9ba1ac9250d88b0f9558b1e687ea22fd1614" Apr 17 16:33:05.288062 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:05.285737 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-qghlf" event={"ID":"4135c5b6-7f8a-4eaf-b551-405c8ab00981","Type":"ContainerDied","Data":"f1b9ba59dd4fe53551f691d3401b9ba1ac9250d88b0f9558b1e687ea22fd1614"} Apr 17 16:33:05.305555 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:05.305525 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7wjpz\" (UniqueName: \"kubernetes.io/projected/79cd9bd5-6c64-4090-a94b-e2339628ff70-kube-api-access-7wjpz\") pod \"migrator-74bb7799d9-lbmtc\" (UID: \"79cd9bd5-6c64-4090-a94b-e2339628ff70\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-lbmtc" Apr 17 16:33:05.313305 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:05.313277 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wjpz\" (UniqueName: \"kubernetes.io/projected/79cd9bd5-6c64-4090-a94b-e2339628ff70-kube-api-access-7wjpz\") pod \"migrator-74bb7799d9-lbmtc\" (UID: \"79cd9bd5-6c64-4090-a94b-e2339628ff70\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-lbmtc" Apr 17 16:33:05.430443 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:05.430413 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-lbmtc" Apr 17 16:33:05.546543 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:05.546513 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-lbmtc"] Apr 17 16:33:05.550040 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:33:05.550007 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79cd9bd5_6c64_4090_a94b_e2339628ff70.slice/crio-6c52d65dc773d10920a129cbf4c33a89628a1f0f8500feb63f6847cdcff2d4d2 WatchSource:0}: Error finding container 6c52d65dc773d10920a129cbf4c33a89628a1f0f8500feb63f6847cdcff2d4d2: Status 404 returned error can't find the container with id 6c52d65dc773d10920a129cbf4c33a89628a1f0f8500feb63f6847cdcff2d4d2 Apr 17 16:33:06.288595 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:06.288553 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-lbmtc" event={"ID":"79cd9bd5-6c64-4090-a94b-e2339628ff70","Type":"ContainerStarted","Data":"6c52d65dc773d10920a129cbf4c33a89628a1f0f8500feb63f6847cdcff2d4d2"} Apr 17 16:33:06.289949 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:06.289917 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-qghlf_4135c5b6-7f8a-4eaf-b551-405c8ab00981/console-operator/1.log" Apr 17 16:33:06.290389 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:06.290369 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-qghlf_4135c5b6-7f8a-4eaf-b551-405c8ab00981/console-operator/0.log" Apr 17 16:33:06.290511 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:06.290414 2572 generic.go:358] "Generic (PLEG): container finished" podID="4135c5b6-7f8a-4eaf-b551-405c8ab00981" 
containerID="0f7b4d2da46431e827e1c5738528fc88a9de8889accdfaed142977871f3f9ce0" exitCode=255 Apr 17 16:33:06.290511 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:06.290455 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-qghlf" event={"ID":"4135c5b6-7f8a-4eaf-b551-405c8ab00981","Type":"ContainerDied","Data":"0f7b4d2da46431e827e1c5738528fc88a9de8889accdfaed142977871f3f9ce0"} Apr 17 16:33:06.290511 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:06.290484 2572 scope.go:117] "RemoveContainer" containerID="f1b9ba59dd4fe53551f691d3401b9ba1ac9250d88b0f9558b1e687ea22fd1614" Apr 17 16:33:06.290797 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:06.290778 2572 scope.go:117] "RemoveContainer" containerID="0f7b4d2da46431e827e1c5738528fc88a9de8889accdfaed142977871f3f9ce0" Apr 17 16:33:06.290975 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:33:06.290956 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-qghlf_openshift-console-operator(4135c5b6-7f8a-4eaf-b551-405c8ab00981)\"" pod="openshift-console-operator/console-operator-9d4b6777b-qghlf" podUID="4135c5b6-7f8a-4eaf-b551-405c8ab00981" Apr 17 16:33:07.297892 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:07.297868 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-qghlf_4135c5b6-7f8a-4eaf-b551-405c8ab00981/console-operator/1.log" Apr 17 16:33:07.298319 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:07.298248 2572 scope.go:117] "RemoveContainer" containerID="0f7b4d2da46431e827e1c5738528fc88a9de8889accdfaed142977871f3f9ce0" Apr 17 16:33:07.298466 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:33:07.298445 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" 
with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-qghlf_openshift-console-operator(4135c5b6-7f8a-4eaf-b551-405c8ab00981)\"" pod="openshift-console-operator/console-operator-9d4b6777b-qghlf" podUID="4135c5b6-7f8a-4eaf-b551-405c8ab00981" Apr 17 16:33:07.299397 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:07.299373 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-lbmtc" event={"ID":"79cd9bd5-6c64-4090-a94b-e2339628ff70","Type":"ContainerStarted","Data":"325e908767ac7e04652abd9b2a20f0bdcde17ac0160b25da312347b1c4c6c4b6"} Apr 17 16:33:07.299397 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:07.299400 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-lbmtc" event={"ID":"79cd9bd5-6c64-4090-a94b-e2339628ff70","Type":"ContainerStarted","Data":"0a684268dc6975b0aadd2854cb21a8d6ab36f02667e1b300c081cf5b12bfb467"} Apr 17 16:33:07.328936 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:07.328896 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-lbmtc" podStartSLOduration=1.298228721 podStartE2EDuration="2.328884371s" podCreationTimestamp="2026-04-17 16:33:05 +0000 UTC" firstStartedPulling="2026-04-17 16:33:05.552278151 +0000 UTC m=+108.139867102" lastFinishedPulling="2026-04-17 16:33:06.582933804 +0000 UTC m=+109.170522752" observedRunningTime="2026-04-17 16:33:07.327878333 +0000 UTC m=+109.915467302" watchObservedRunningTime="2026-04-17 16:33:07.328884371 +0000 UTC m=+109.916473341" Apr 17 16:33:08.097188 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:08.097157 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-5q2mv_8528bd48-4f37-4fef-bd6e-7df9d6a2773f/dns-node-resolver/0.log" Apr 17 16:33:08.110546 ip-10-0-131-177 kubenswrapper[2572]: I0417 
16:33:08.110507 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-skfw9"] Apr 17 16:33:08.113378 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:08.113363 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-skfw9" Apr 17 16:33:08.115594 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:08.115571 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 17 16:33:08.116673 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:08.116648 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 17 16:33:08.116673 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:08.116652 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 17 16:33:08.116840 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:08.116651 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 17 16:33:08.116840 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:08.116701 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-68s6z\"" Apr 17 16:33:08.119895 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:08.119859 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-skfw9"] Apr 17 16:33:08.227573 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:08.227544 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a280da6f-8899-47aa-ac6c-6b5ddcada842-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-v7vr6\" (UID: \"a280da6f-8899-47aa-ac6c-6b5ddcada842\") " 
pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-v7vr6" Apr 17 16:33:08.227713 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:08.227586 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/104602d2-0107-4855-958f-ec4a00d8bb04-signing-key\") pod \"service-ca-865cb79987-skfw9\" (UID: \"104602d2-0107-4855-958f-ec4a00d8bb04\") " pod="openshift-service-ca/service-ca-865cb79987-skfw9" Apr 17 16:33:08.227713 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:08.227620 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/104602d2-0107-4855-958f-ec4a00d8bb04-signing-cabundle\") pod \"service-ca-865cb79987-skfw9\" (UID: \"104602d2-0107-4855-958f-ec4a00d8bb04\") " pod="openshift-service-ca/service-ca-865cb79987-skfw9" Apr 17 16:33:08.227713 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:33:08.227690 2572 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 16:33:08.227713 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:08.227731 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bxvd\" (UniqueName: \"kubernetes.io/projected/104602d2-0107-4855-958f-ec4a00d8bb04-kube-api-access-9bxvd\") pod \"service-ca-865cb79987-skfw9\" (UID: \"104602d2-0107-4855-958f-ec4a00d8bb04\") " pod="openshift-service-ca/service-ca-865cb79987-skfw9" Apr 17 16:33:08.227878 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:33:08.227764 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a280da6f-8899-47aa-ac6c-6b5ddcada842-cluster-monitoring-operator-tls podName:a280da6f-8899-47aa-ac6c-6b5ddcada842 nodeName:}" failed. 
No retries permitted until 2026-04-17 16:33:16.227750129 +0000 UTC m=+118.815339077 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/a280da6f-8899-47aa-ac6c-6b5ddcada842-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-v7vr6" (UID: "a280da6f-8899-47aa-ac6c-6b5ddcada842") : secret "cluster-monitoring-operator-tls" not found Apr 17 16:33:08.328134 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:08.328104 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/104602d2-0107-4855-958f-ec4a00d8bb04-signing-key\") pod \"service-ca-865cb79987-skfw9\" (UID: \"104602d2-0107-4855-958f-ec4a00d8bb04\") " pod="openshift-service-ca/service-ca-865cb79987-skfw9" Apr 17 16:33:08.328458 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:08.328156 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/104602d2-0107-4855-958f-ec4a00d8bb04-signing-cabundle\") pod \"service-ca-865cb79987-skfw9\" (UID: \"104602d2-0107-4855-958f-ec4a00d8bb04\") " pod="openshift-service-ca/service-ca-865cb79987-skfw9" Apr 17 16:33:08.328458 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:08.328185 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9bxvd\" (UniqueName: \"kubernetes.io/projected/104602d2-0107-4855-958f-ec4a00d8bb04-kube-api-access-9bxvd\") pod \"service-ca-865cb79987-skfw9\" (UID: \"104602d2-0107-4855-958f-ec4a00d8bb04\") " pod="openshift-service-ca/service-ca-865cb79987-skfw9" Apr 17 16:33:08.328901 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:08.328878 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/104602d2-0107-4855-958f-ec4a00d8bb04-signing-cabundle\") pod 
\"service-ca-865cb79987-skfw9\" (UID: \"104602d2-0107-4855-958f-ec4a00d8bb04\") " pod="openshift-service-ca/service-ca-865cb79987-skfw9" Apr 17 16:33:08.330549 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:08.330527 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/104602d2-0107-4855-958f-ec4a00d8bb04-signing-key\") pod \"service-ca-865cb79987-skfw9\" (UID: \"104602d2-0107-4855-958f-ec4a00d8bb04\") " pod="openshift-service-ca/service-ca-865cb79987-skfw9" Apr 17 16:33:08.336417 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:08.336389 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bxvd\" (UniqueName: \"kubernetes.io/projected/104602d2-0107-4855-958f-ec4a00d8bb04-kube-api-access-9bxvd\") pod \"service-ca-865cb79987-skfw9\" (UID: \"104602d2-0107-4855-958f-ec4a00d8bb04\") " pod="openshift-service-ca/service-ca-865cb79987-skfw9" Apr 17 16:33:08.422072 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:08.422009 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-skfw9" Apr 17 16:33:08.534847 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:08.534819 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-skfw9"] Apr 17 16:33:08.537785 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:33:08.537757 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod104602d2_0107_4855_958f_ec4a00d8bb04.slice/crio-897b5cbece7e67838dbf6cd63121dec7aed38656771b1c50d1028881df692b8c WatchSource:0}: Error finding container 897b5cbece7e67838dbf6cd63121dec7aed38656771b1c50d1028881df692b8c: Status 404 returned error can't find the container with id 897b5cbece7e67838dbf6cd63121dec7aed38656771b1c50d1028881df692b8c Apr 17 16:33:09.291888 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:09.291865 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-gmh4h_2685d399-ce45-4aec-bf5c-ce5d17cb16f4/node-ca/0.log" Apr 17 16:33:09.305496 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:09.305464 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-skfw9" event={"ID":"104602d2-0107-4855-958f-ec4a00d8bb04","Type":"ContainerStarted","Data":"e9226b5495f67104bb04d32304b3bfa73e620d8a2dd082d0b54a2f7829b986fd"} Apr 17 16:33:09.305640 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:09.305503 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-skfw9" event={"ID":"104602d2-0107-4855-958f-ec4a00d8bb04","Type":"ContainerStarted","Data":"897b5cbece7e67838dbf6cd63121dec7aed38656771b1c50d1028881df692b8c"} Apr 17 16:33:09.322422 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:09.322375 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-skfw9" podStartSLOduration=1.322361874 
podStartE2EDuration="1.322361874s" podCreationTimestamp="2026-04-17 16:33:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:33:09.320819036 +0000 UTC m=+111.908408005" watchObservedRunningTime="2026-04-17 16:33:09.322361874 +0000 UTC m=+111.909950845" Apr 17 16:33:10.293029 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:10.292992 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-lbmtc_79cd9bd5-6c64-4090-a94b-e2339628ff70/migrator/0.log" Apr 17 16:33:10.492451 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:10.492418 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-lbmtc_79cd9bd5-6c64-4090-a94b-e2339628ff70/graceful-termination/0.log" Apr 17 16:33:10.774971 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:10.774932 2572 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-qghlf" Apr 17 16:33:10.775295 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:10.775283 2572 scope.go:117] "RemoveContainer" containerID="0f7b4d2da46431e827e1c5738528fc88a9de8889accdfaed142977871f3f9ce0" Apr 17 16:33:10.775474 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:33:10.775458 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-qghlf_openshift-console-operator(4135c5b6-7f8a-4eaf-b551-405c8ab00981)\"" pod="openshift-console-operator/console-operator-9d4b6777b-qghlf" podUID="4135c5b6-7f8a-4eaf-b551-405c8ab00981" Apr 17 16:33:14.278161 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:14.278128 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-console-operator/console-operator-9d4b6777b-qghlf" Apr 17 16:33:14.278558 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:14.278474 2572 scope.go:117] "RemoveContainer" containerID="0f7b4d2da46431e827e1c5738528fc88a9de8889accdfaed142977871f3f9ce0" Apr 17 16:33:14.278648 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:33:14.278631 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-qghlf_openshift-console-operator(4135c5b6-7f8a-4eaf-b551-405c8ab00981)\"" pod="openshift-console-operator/console-operator-9d4b6777b-qghlf" podUID="4135c5b6-7f8a-4eaf-b551-405c8ab00981" Apr 17 16:33:16.288512 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:16.288478 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a280da6f-8899-47aa-ac6c-6b5ddcada842-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-v7vr6\" (UID: \"a280da6f-8899-47aa-ac6c-6b5ddcada842\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-v7vr6" Apr 17 16:33:16.288903 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:33:16.288621 2572 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 16:33:16.288903 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:33:16.288693 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a280da6f-8899-47aa-ac6c-6b5ddcada842-cluster-monitoring-operator-tls podName:a280da6f-8899-47aa-ac6c-6b5ddcada842 nodeName:}" failed. No retries permitted until 2026-04-17 16:33:32.288671786 +0000 UTC m=+134.876260734 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/a280da6f-8899-47aa-ac6c-6b5ddcada842-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-v7vr6" (UID: "a280da6f-8899-47aa-ac6c-6b5ddcada842") : secret "cluster-monitoring-operator-tls" not found Apr 17 16:33:25.969390 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:25.969357 2572 scope.go:117] "RemoveContainer" containerID="0f7b4d2da46431e827e1c5738528fc88a9de8889accdfaed142977871f3f9ce0" Apr 17 16:33:26.293509 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:26.293437 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-87jqc"] Apr 17 16:33:26.296550 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:26.296534 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-87jqc" Apr 17 16:33:26.300612 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:26.300591 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 17 16:33:26.300612 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:26.300591 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-shqjl\"" Apr 17 16:33:26.304640 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:26.304620 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 17 16:33:26.315354 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:26.315331 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-87jqc"] Apr 17 16:33:26.347817 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:26.347800 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-qghlf_4135c5b6-7f8a-4eaf-b551-405c8ab00981/console-operator/1.log" Apr 17 16:33:26.347926 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:26.347863 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-qghlf" event={"ID":"4135c5b6-7f8a-4eaf-b551-405c8ab00981","Type":"ContainerStarted","Data":"171bdd99b3deff7dd61a3742743eeaaff5f6ffec432b05f6b11daa02ee5eaaa0"} Apr 17 16:33:26.348115 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:26.348100 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-qghlf" Apr 17 16:33:26.366895 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:26.366860 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/017a011d-f629-47fd-9ac9-16112dbabf6a-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-87jqc\" (UID: \"017a011d-f629-47fd-9ac9-16112dbabf6a\") " pod="openshift-insights/insights-runtime-extractor-87jqc" Apr 17 16:33:26.367012 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:26.366933 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/017a011d-f629-47fd-9ac9-16112dbabf6a-crio-socket\") pod \"insights-runtime-extractor-87jqc\" (UID: \"017a011d-f629-47fd-9ac9-16112dbabf6a\") " pod="openshift-insights/insights-runtime-extractor-87jqc" Apr 17 16:33:26.367012 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:26.366985 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/017a011d-f629-47fd-9ac9-16112dbabf6a-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-87jqc\" (UID: 
\"017a011d-f629-47fd-9ac9-16112dbabf6a\") " pod="openshift-insights/insights-runtime-extractor-87jqc" Apr 17 16:33:26.367118 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:26.367036 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/017a011d-f629-47fd-9ac9-16112dbabf6a-data-volume\") pod \"insights-runtime-extractor-87jqc\" (UID: \"017a011d-f629-47fd-9ac9-16112dbabf6a\") " pod="openshift-insights/insights-runtime-extractor-87jqc" Apr 17 16:33:26.367118 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:26.367068 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2sbqx\" (UniqueName: \"kubernetes.io/projected/017a011d-f629-47fd-9ac9-16112dbabf6a-kube-api-access-2sbqx\") pod \"insights-runtime-extractor-87jqc\" (UID: \"017a011d-f629-47fd-9ac9-16112dbabf6a\") " pod="openshift-insights/insights-runtime-extractor-87jqc" Apr 17 16:33:26.467613 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:26.467583 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/017a011d-f629-47fd-9ac9-16112dbabf6a-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-87jqc\" (UID: \"017a011d-f629-47fd-9ac9-16112dbabf6a\") " pod="openshift-insights/insights-runtime-extractor-87jqc" Apr 17 16:33:26.467787 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:26.467762 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/017a011d-f629-47fd-9ac9-16112dbabf6a-crio-socket\") pod \"insights-runtime-extractor-87jqc\" (UID: \"017a011d-f629-47fd-9ac9-16112dbabf6a\") " pod="openshift-insights/insights-runtime-extractor-87jqc" Apr 17 16:33:26.467897 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:26.467879 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/017a011d-f629-47fd-9ac9-16112dbabf6a-crio-socket\") pod \"insights-runtime-extractor-87jqc\" (UID: \"017a011d-f629-47fd-9ac9-16112dbabf6a\") " pod="openshift-insights/insights-runtime-extractor-87jqc" Apr 17 16:33:26.467971 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:26.467878 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/017a011d-f629-47fd-9ac9-16112dbabf6a-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-87jqc\" (UID: \"017a011d-f629-47fd-9ac9-16112dbabf6a\") " pod="openshift-insights/insights-runtime-extractor-87jqc" Apr 17 16:33:26.467971 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:26.467939 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/017a011d-f629-47fd-9ac9-16112dbabf6a-data-volume\") pod \"insights-runtime-extractor-87jqc\" (UID: \"017a011d-f629-47fd-9ac9-16112dbabf6a\") " pod="openshift-insights/insights-runtime-extractor-87jqc" Apr 17 16:33:26.468076 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:26.467974 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2sbqx\" (UniqueName: \"kubernetes.io/projected/017a011d-f629-47fd-9ac9-16112dbabf6a-kube-api-access-2sbqx\") pod \"insights-runtime-extractor-87jqc\" (UID: \"017a011d-f629-47fd-9ac9-16112dbabf6a\") " pod="openshift-insights/insights-runtime-extractor-87jqc" Apr 17 16:33:26.468271 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:26.468241 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/017a011d-f629-47fd-9ac9-16112dbabf6a-data-volume\") pod \"insights-runtime-extractor-87jqc\" (UID: \"017a011d-f629-47fd-9ac9-16112dbabf6a\") " pod="openshift-insights/insights-runtime-extractor-87jqc" Apr 17 
16:33:26.468271 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:26.468252 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/017a011d-f629-47fd-9ac9-16112dbabf6a-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-87jqc\" (UID: \"017a011d-f629-47fd-9ac9-16112dbabf6a\") " pod="openshift-insights/insights-runtime-extractor-87jqc" Apr 17 16:33:26.470328 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:26.470308 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/017a011d-f629-47fd-9ac9-16112dbabf6a-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-87jqc\" (UID: \"017a011d-f629-47fd-9ac9-16112dbabf6a\") " pod="openshift-insights/insights-runtime-extractor-87jqc" Apr 17 16:33:26.480180 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:26.480154 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2sbqx\" (UniqueName: \"kubernetes.io/projected/017a011d-f629-47fd-9ac9-16112dbabf6a-kube-api-access-2sbqx\") pod \"insights-runtime-extractor-87jqc\" (UID: \"017a011d-f629-47fd-9ac9-16112dbabf6a\") " pod="openshift-insights/insights-runtime-extractor-87jqc" Apr 17 16:33:26.605569 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:26.605490 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-87jqc" Apr 17 16:33:26.738285 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:26.738257 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-87jqc"] Apr 17 16:33:26.741419 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:33:26.741392 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod017a011d_f629_47fd_9ac9_16112dbabf6a.slice/crio-04a07f8b0d84765f6ed70e0c7db9afc7eb72f514d44a2fda120ba706c9751cca WatchSource:0}: Error finding container 04a07f8b0d84765f6ed70e0c7db9afc7eb72f514d44a2fda120ba706c9751cca: Status 404 returned error can't find the container with id 04a07f8b0d84765f6ed70e0c7db9afc7eb72f514d44a2fda120ba706c9751cca Apr 17 16:33:26.862959 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:26.862875 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-qghlf" Apr 17 16:33:27.040710 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:27.040669 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-62lmr"] Apr 17 16:33:27.043829 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:27.043811 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-62lmr" Apr 17 16:33:27.046088 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:27.046069 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-792lt\"" Apr 17 16:33:27.046178 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:27.046112 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 17 16:33:27.046178 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:27.046131 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 17 16:33:27.052102 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:27.052080 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-62lmr"] Apr 17 16:33:27.173745 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:27.173658 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frjjx\" (UniqueName: \"kubernetes.io/projected/9f5ed73a-d8e4-4937-b8df-b1d3c48f6efa-kube-api-access-frjjx\") pod \"downloads-6bcc868b7-62lmr\" (UID: \"9f5ed73a-d8e4-4937-b8df-b1d3c48f6efa\") " pod="openshift-console/downloads-6bcc868b7-62lmr" Apr 17 16:33:27.274726 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:27.274691 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-frjjx\" (UniqueName: \"kubernetes.io/projected/9f5ed73a-d8e4-4937-b8df-b1d3c48f6efa-kube-api-access-frjjx\") pod \"downloads-6bcc868b7-62lmr\" (UID: \"9f5ed73a-d8e4-4937-b8df-b1d3c48f6efa\") " pod="openshift-console/downloads-6bcc868b7-62lmr" Apr 17 16:33:27.283372 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:27.283348 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-frjjx\" (UniqueName: 
\"kubernetes.io/projected/9f5ed73a-d8e4-4937-b8df-b1d3c48f6efa-kube-api-access-frjjx\") pod \"downloads-6bcc868b7-62lmr\" (UID: \"9f5ed73a-d8e4-4937-b8df-b1d3c48f6efa\") " pod="openshift-console/downloads-6bcc868b7-62lmr" Apr 17 16:33:27.351038 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:27.351008 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-87jqc" event={"ID":"017a011d-f629-47fd-9ac9-16112dbabf6a","Type":"ContainerStarted","Data":"efecaac2971a3c632bbb451ebe1d03265a1181d97fac60bfbd0bee1b83c42872"} Apr 17 16:33:27.351141 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:27.351050 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-87jqc" event={"ID":"017a011d-f629-47fd-9ac9-16112dbabf6a","Type":"ContainerStarted","Data":"04a07f8b0d84765f6ed70e0c7db9afc7eb72f514d44a2fda120ba706c9751cca"} Apr 17 16:33:27.352629 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:27.352609 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-62lmr" Apr 17 16:33:27.496741 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:27.496697 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-62lmr"] Apr 17 16:33:27.499238 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:33:27.499211 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f5ed73a_d8e4_4937_b8df_b1d3c48f6efa.slice/crio-545a869ff7445a09906d231862dddd10b3ea9bd049771b589787872e44940305 WatchSource:0}: Error finding container 545a869ff7445a09906d231862dddd10b3ea9bd049771b589787872e44940305: Status 404 returned error can't find the container with id 545a869ff7445a09906d231862dddd10b3ea9bd049771b589787872e44940305 Apr 17 16:33:27.779328 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:27.779253 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/36f3412d-e266-4f24-8ea6-1f3d3cdd2546-metrics-certs\") pod \"network-metrics-daemon-vw79z\" (UID: \"36f3412d-e266-4f24-8ea6-1f3d3cdd2546\") " pod="openshift-multus/network-metrics-daemon-vw79z" Apr 17 16:33:27.781596 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:27.781578 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/36f3412d-e266-4f24-8ea6-1f3d3cdd2546-metrics-certs\") pod \"network-metrics-daemon-vw79z\" (UID: \"36f3412d-e266-4f24-8ea6-1f3d3cdd2546\") " pod="openshift-multus/network-metrics-daemon-vw79z" Apr 17 16:33:27.894465 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:27.894432 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-qkztw\"" Apr 17 16:33:27.902573 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:27.902551 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vw79z" Apr 17 16:33:28.030848 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:28.030778 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-vw79z"] Apr 17 16:33:28.034300 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:33:28.034271 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36f3412d_e266_4f24_8ea6_1f3d3cdd2546.slice/crio-bfbf0e94585fae0f985ac5021fe3ebba5c1bab355ff3d71a10a79ac46b1d129d WatchSource:0}: Error finding container bfbf0e94585fae0f985ac5021fe3ebba5c1bab355ff3d71a10a79ac46b1d129d: Status 404 returned error can't find the container with id bfbf0e94585fae0f985ac5021fe3ebba5c1bab355ff3d71a10a79ac46b1d129d Apr 17 16:33:28.357618 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:28.355361 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-62lmr" event={"ID":"9f5ed73a-d8e4-4937-b8df-b1d3c48f6efa","Type":"ContainerStarted","Data":"545a869ff7445a09906d231862dddd10b3ea9bd049771b589787872e44940305"} Apr 17 16:33:28.357618 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:28.356663 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-vw79z" event={"ID":"36f3412d-e266-4f24-8ea6-1f3d3cdd2546","Type":"ContainerStarted","Data":"bfbf0e94585fae0f985ac5021fe3ebba5c1bab355ff3d71a10a79ac46b1d129d"} Apr 17 16:33:28.358698 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:28.358665 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-87jqc" event={"ID":"017a011d-f629-47fd-9ac9-16112dbabf6a","Type":"ContainerStarted","Data":"84ff747c0ccb237eb21b3e0c52ec55013387cd7627219078e6dd14d87b837ea4"} Apr 17 16:33:30.366263 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:30.366221 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/network-metrics-daemon-vw79z" event={"ID":"36f3412d-e266-4f24-8ea6-1f3d3cdd2546","Type":"ContainerStarted","Data":"202437c96a98f3fd4b4fd5f211f81f3b9836529237fa4f74d27b55fdf77966df"} Apr 17 16:33:30.366263 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:30.366268 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-vw79z" event={"ID":"36f3412d-e266-4f24-8ea6-1f3d3cdd2546","Type":"ContainerStarted","Data":"2472fd057d6914b38c91650b6d589221be1f84d59b22c4d243ef0858e26994d1"} Apr 17 16:33:30.368523 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:30.368485 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-87jqc" event={"ID":"017a011d-f629-47fd-9ac9-16112dbabf6a","Type":"ContainerStarted","Data":"97188d2c0dc9a5592b1081948452ee8d9fbecc9dedfc494349b8497fa9dc0625"} Apr 17 16:33:30.383555 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:30.383492 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-vw79z" podStartSLOduration=130.788713202 podStartE2EDuration="2m12.383477845s" podCreationTimestamp="2026-04-17 16:31:18 +0000 UTC" firstStartedPulling="2026-04-17 16:33:28.036861157 +0000 UTC m=+130.624450105" lastFinishedPulling="2026-04-17 16:33:29.631625796 +0000 UTC m=+132.219214748" observedRunningTime="2026-04-17 16:33:30.383049679 +0000 UTC m=+132.970638651" watchObservedRunningTime="2026-04-17 16:33:30.383477845 +0000 UTC m=+132.971066815" Apr 17 16:33:30.404560 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:30.404498 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-87jqc" podStartSLOduration=1.577369928 podStartE2EDuration="4.404482637s" podCreationTimestamp="2026-04-17 16:33:26 +0000 UTC" firstStartedPulling="2026-04-17 16:33:26.802549811 +0000 UTC m=+129.390138758" lastFinishedPulling="2026-04-17 
16:33:29.629662507 +0000 UTC m=+132.217251467" observedRunningTime="2026-04-17 16:33:30.404196196 +0000 UTC m=+132.991785168" watchObservedRunningTime="2026-04-17 16:33:30.404482637 +0000 UTC m=+132.992071608" Apr 17 16:33:32.318021 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:32.317982 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a280da6f-8899-47aa-ac6c-6b5ddcada842-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-v7vr6\" (UID: \"a280da6f-8899-47aa-ac6c-6b5ddcada842\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-v7vr6" Apr 17 16:33:32.321070 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:32.321023 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a280da6f-8899-47aa-ac6c-6b5ddcada842-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-v7vr6\" (UID: \"a280da6f-8899-47aa-ac6c-6b5ddcada842\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-v7vr6" Apr 17 16:33:32.565880 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:32.565846 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-b7ztz\"" Apr 17 16:33:32.574100 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:32.574034 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-v7vr6" Apr 17 16:33:32.724225 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:32.724171 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-v7vr6"] Apr 17 16:33:32.728126 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:33:32.728094 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda280da6f_8899_47aa_ac6c_6b5ddcada842.slice/crio-4463406f24361fb1ef7d16593eda063ac147438e1c6c8865f5a44b86cf6e55c4 WatchSource:0}: Error finding container 4463406f24361fb1ef7d16593eda063ac147438e1c6c8865f5a44b86cf6e55c4: Status 404 returned error can't find the container with id 4463406f24361fb1ef7d16593eda063ac147438e1c6c8865f5a44b86cf6e55c4 Apr 17 16:33:33.380971 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:33.380912 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-v7vr6" event={"ID":"a280da6f-8899-47aa-ac6c-6b5ddcada842","Type":"ContainerStarted","Data":"4463406f24361fb1ef7d16593eda063ac147438e1c6c8865f5a44b86cf6e55c4"} Apr 17 16:33:33.403086 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:33.403052 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-69b87c657d-gcmrf"] Apr 17 16:33:33.405253 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:33.405226 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-69b87c657d-gcmrf" Apr 17 16:33:33.408551 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:33.408529 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 17 16:33:33.408756 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:33.408739 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 17 16:33:33.409952 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:33.409441 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 17 16:33:33.409952 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:33.409490 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 17 16:33:33.409952 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:33.409706 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 17 16:33:33.409952 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:33.409879 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-r5h94\"" Apr 17 16:33:33.422182 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:33.422141 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-69b87c657d-gcmrf"] Apr 17 16:33:33.528135 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:33.528089 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vzz9\" (UniqueName: \"kubernetes.io/projected/cfaadc66-808a-432f-b9e6-adfce98dfbac-kube-api-access-7vzz9\") pod \"console-69b87c657d-gcmrf\" (UID: \"cfaadc66-808a-432f-b9e6-adfce98dfbac\") " pod="openshift-console/console-69b87c657d-gcmrf" Apr 17 16:33:33.528315 ip-10-0-131-177 
kubenswrapper[2572]: I0417 16:33:33.528220 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cfaadc66-808a-432f-b9e6-adfce98dfbac-console-oauth-config\") pod \"console-69b87c657d-gcmrf\" (UID: \"cfaadc66-808a-432f-b9e6-adfce98dfbac\") " pod="openshift-console/console-69b87c657d-gcmrf" Apr 17 16:33:33.528315 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:33.528263 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cfaadc66-808a-432f-b9e6-adfce98dfbac-oauth-serving-cert\") pod \"console-69b87c657d-gcmrf\" (UID: \"cfaadc66-808a-432f-b9e6-adfce98dfbac\") " pod="openshift-console/console-69b87c657d-gcmrf" Apr 17 16:33:33.528315 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:33.528309 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cfaadc66-808a-432f-b9e6-adfce98dfbac-service-ca\") pod \"console-69b87c657d-gcmrf\" (UID: \"cfaadc66-808a-432f-b9e6-adfce98dfbac\") " pod="openshift-console/console-69b87c657d-gcmrf" Apr 17 16:33:33.528479 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:33.528345 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cfaadc66-808a-432f-b9e6-adfce98dfbac-console-serving-cert\") pod \"console-69b87c657d-gcmrf\" (UID: \"cfaadc66-808a-432f-b9e6-adfce98dfbac\") " pod="openshift-console/console-69b87c657d-gcmrf" Apr 17 16:33:33.528479 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:33.528394 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cfaadc66-808a-432f-b9e6-adfce98dfbac-console-config\") pod 
\"console-69b87c657d-gcmrf\" (UID: \"cfaadc66-808a-432f-b9e6-adfce98dfbac\") " pod="openshift-console/console-69b87c657d-gcmrf" Apr 17 16:33:33.629002 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:33.628966 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cfaadc66-808a-432f-b9e6-adfce98dfbac-console-oauth-config\") pod \"console-69b87c657d-gcmrf\" (UID: \"cfaadc66-808a-432f-b9e6-adfce98dfbac\") " pod="openshift-console/console-69b87c657d-gcmrf" Apr 17 16:33:33.629172 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:33.629020 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cfaadc66-808a-432f-b9e6-adfce98dfbac-oauth-serving-cert\") pod \"console-69b87c657d-gcmrf\" (UID: \"cfaadc66-808a-432f-b9e6-adfce98dfbac\") " pod="openshift-console/console-69b87c657d-gcmrf" Apr 17 16:33:33.629172 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:33.629060 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cfaadc66-808a-432f-b9e6-adfce98dfbac-service-ca\") pod \"console-69b87c657d-gcmrf\" (UID: \"cfaadc66-808a-432f-b9e6-adfce98dfbac\") " pod="openshift-console/console-69b87c657d-gcmrf" Apr 17 16:33:33.629172 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:33.629109 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cfaadc66-808a-432f-b9e6-adfce98dfbac-console-serving-cert\") pod \"console-69b87c657d-gcmrf\" (UID: \"cfaadc66-808a-432f-b9e6-adfce98dfbac\") " pod="openshift-console/console-69b87c657d-gcmrf" Apr 17 16:33:33.629172 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:33.629160 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/cfaadc66-808a-432f-b9e6-adfce98dfbac-console-config\") pod \"console-69b87c657d-gcmrf\" (UID: \"cfaadc66-808a-432f-b9e6-adfce98dfbac\") " pod="openshift-console/console-69b87c657d-gcmrf" Apr 17 16:33:33.629339 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:33.629195 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7vzz9\" (UniqueName: \"kubernetes.io/projected/cfaadc66-808a-432f-b9e6-adfce98dfbac-kube-api-access-7vzz9\") pod \"console-69b87c657d-gcmrf\" (UID: \"cfaadc66-808a-432f-b9e6-adfce98dfbac\") " pod="openshift-console/console-69b87c657d-gcmrf" Apr 17 16:33:33.630230 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:33.630200 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cfaadc66-808a-432f-b9e6-adfce98dfbac-service-ca\") pod \"console-69b87c657d-gcmrf\" (UID: \"cfaadc66-808a-432f-b9e6-adfce98dfbac\") " pod="openshift-console/console-69b87c657d-gcmrf" Apr 17 16:33:33.630346 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:33.630244 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cfaadc66-808a-432f-b9e6-adfce98dfbac-oauth-serving-cert\") pod \"console-69b87c657d-gcmrf\" (UID: \"cfaadc66-808a-432f-b9e6-adfce98dfbac\") " pod="openshift-console/console-69b87c657d-gcmrf" Apr 17 16:33:33.630346 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:33.630200 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cfaadc66-808a-432f-b9e6-adfce98dfbac-console-config\") pod \"console-69b87c657d-gcmrf\" (UID: \"cfaadc66-808a-432f-b9e6-adfce98dfbac\") " pod="openshift-console/console-69b87c657d-gcmrf" Apr 17 16:33:33.632536 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:33.632471 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cfaadc66-808a-432f-b9e6-adfce98dfbac-console-oauth-config\") pod \"console-69b87c657d-gcmrf\" (UID: \"cfaadc66-808a-432f-b9e6-adfce98dfbac\") " pod="openshift-console/console-69b87c657d-gcmrf" Apr 17 16:33:33.632992 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:33.632968 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cfaadc66-808a-432f-b9e6-adfce98dfbac-console-serving-cert\") pod \"console-69b87c657d-gcmrf\" (UID: \"cfaadc66-808a-432f-b9e6-adfce98dfbac\") " pod="openshift-console/console-69b87c657d-gcmrf" Apr 17 16:33:33.638210 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:33.638184 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vzz9\" (UniqueName: \"kubernetes.io/projected/cfaadc66-808a-432f-b9e6-adfce98dfbac-kube-api-access-7vzz9\") pod \"console-69b87c657d-gcmrf\" (UID: \"cfaadc66-808a-432f-b9e6-adfce98dfbac\") " pod="openshift-console/console-69b87c657d-gcmrf" Apr 17 16:33:33.718547 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:33.718502 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-69b87c657d-gcmrf" Apr 17 16:33:33.862194 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:33.862162 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-69b87c657d-gcmrf"] Apr 17 16:33:33.866272 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:33:33.866238 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcfaadc66_808a_432f_b9e6_adfce98dfbac.slice/crio-cf1d122e2302c5a6e8aec945f8eca84585cea6321d6b058fee6d8a44b1a5167f WatchSource:0}: Error finding container cf1d122e2302c5a6e8aec945f8eca84585cea6321d6b058fee6d8a44b1a5167f: Status 404 returned error can't find the container with id cf1d122e2302c5a6e8aec945f8eca84585cea6321d6b058fee6d8a44b1a5167f Apr 17 16:33:34.385752 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:34.385694 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-69b87c657d-gcmrf" event={"ID":"cfaadc66-808a-432f-b9e6-adfce98dfbac","Type":"ContainerStarted","Data":"cf1d122e2302c5a6e8aec945f8eca84585cea6321d6b058fee6d8a44b1a5167f"} Apr 17 16:33:35.392658 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:35.392577 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-v7vr6" event={"ID":"a280da6f-8899-47aa-ac6c-6b5ddcada842","Type":"ContainerStarted","Data":"b5e97230eeb4023375b9fa895a8d3ff6075d385a4b548a907cad5c285b1c8e3b"} Apr 17 16:33:35.409739 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:35.409150 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-v7vr6" podStartSLOduration=33.611502956 podStartE2EDuration="35.409133238s" podCreationTimestamp="2026-04-17 16:33:00 +0000 UTC" firstStartedPulling="2026-04-17 16:33:32.730327493 +0000 UTC m=+135.317916454" lastFinishedPulling="2026-04-17 16:33:34.527957788 
+0000 UTC m=+137.115546736" observedRunningTime="2026-04-17 16:33:35.408524547 +0000 UTC m=+137.996113515" watchObservedRunningTime="2026-04-17 16:33:35.409133238 +0000 UTC m=+137.996722211" Apr 17 16:33:38.130739 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:38.130675 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-jgxs5"] Apr 17 16:33:38.147181 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:38.147132 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-jgxs5"] Apr 17 16:33:38.147348 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:38.147271 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-jgxs5" Apr 17 16:33:38.149885 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:38.149864 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 17 16:33:38.150203 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:38.149875 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 17 16:33:38.150422 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:38.149919 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 17 16:33:38.150543 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:38.149993 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-6k4t4\"" Apr 17 16:33:38.274552 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:38.274504 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/f0a45005-1762-44a5-a570-e1300c5a70e4-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-jgxs5\" (UID: \"f0a45005-1762-44a5-a570-e1300c5a70e4\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jgxs5" Apr 17 16:33:38.274775 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:38.274571 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/f0a45005-1762-44a5-a570-e1300c5a70e4-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-jgxs5\" (UID: \"f0a45005-1762-44a5-a570-e1300c5a70e4\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jgxs5" Apr 17 16:33:38.274775 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:38.274651 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f0a45005-1762-44a5-a570-e1300c5a70e4-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-jgxs5\" (UID: \"f0a45005-1762-44a5-a570-e1300c5a70e4\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jgxs5" Apr 17 16:33:38.274895 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:38.274760 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfcjm\" (UniqueName: \"kubernetes.io/projected/f0a45005-1762-44a5-a570-e1300c5a70e4-kube-api-access-wfcjm\") pod \"prometheus-operator-5676c8c784-jgxs5\" (UID: \"f0a45005-1762-44a5-a570-e1300c5a70e4\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jgxs5" Apr 17 16:33:38.375310 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:38.375272 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wfcjm\" (UniqueName: \"kubernetes.io/projected/f0a45005-1762-44a5-a570-e1300c5a70e4-kube-api-access-wfcjm\") pod \"prometheus-operator-5676c8c784-jgxs5\" 
(UID: \"f0a45005-1762-44a5-a570-e1300c5a70e4\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jgxs5" Apr 17 16:33:38.375472 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:38.375358 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f0a45005-1762-44a5-a570-e1300c5a70e4-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-jgxs5\" (UID: \"f0a45005-1762-44a5-a570-e1300c5a70e4\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jgxs5" Apr 17 16:33:38.375472 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:38.375398 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/f0a45005-1762-44a5-a570-e1300c5a70e4-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-jgxs5\" (UID: \"f0a45005-1762-44a5-a570-e1300c5a70e4\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jgxs5" Apr 17 16:33:38.375472 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:38.375454 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f0a45005-1762-44a5-a570-e1300c5a70e4-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-jgxs5\" (UID: \"f0a45005-1762-44a5-a570-e1300c5a70e4\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jgxs5" Apr 17 16:33:38.376421 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:38.376388 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f0a45005-1762-44a5-a570-e1300c5a70e4-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-jgxs5\" (UID: \"f0a45005-1762-44a5-a570-e1300c5a70e4\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jgxs5" Apr 17 16:33:38.378187 ip-10-0-131-177 kubenswrapper[2572]: I0417 
16:33:38.378164 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f0a45005-1762-44a5-a570-e1300c5a70e4-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-jgxs5\" (UID: \"f0a45005-1762-44a5-a570-e1300c5a70e4\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jgxs5" Apr 17 16:33:38.378292 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:38.378182 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/f0a45005-1762-44a5-a570-e1300c5a70e4-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-jgxs5\" (UID: \"f0a45005-1762-44a5-a570-e1300c5a70e4\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jgxs5" Apr 17 16:33:38.387587 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:38.387532 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfcjm\" (UniqueName: \"kubernetes.io/projected/f0a45005-1762-44a5-a570-e1300c5a70e4-kube-api-access-wfcjm\") pod \"prometheus-operator-5676c8c784-jgxs5\" (UID: \"f0a45005-1762-44a5-a570-e1300c5a70e4\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jgxs5" Apr 17 16:33:38.460292 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:38.460248 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-jgxs5" Apr 17 16:33:44.218792 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:44.218737 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-jgxs5"] Apr 17 16:33:44.221789 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:33:44.221763 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf0a45005_1762_44a5_a570_e1300c5a70e4.slice/crio-d34972bf09b194b92302fbee95b41af70663d60b852dbc42b2c8d9b740c74a04 WatchSource:0}: Error finding container d34972bf09b194b92302fbee95b41af70663d60b852dbc42b2c8d9b740c74a04: Status 404 returned error can't find the container with id d34972bf09b194b92302fbee95b41af70663d60b852dbc42b2c8d9b740c74a04 Apr 17 16:33:44.418121 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:44.418024 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-62lmr" event={"ID":"9f5ed73a-d8e4-4937-b8df-b1d3c48f6efa","Type":"ContainerStarted","Data":"40e1bf509edbc4edf6dbf9fafba1b872c2752b959db19488e13337daeaa493e2"} Apr 17 16:33:44.418428 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:44.418299 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-62lmr" Apr 17 16:33:44.419612 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:44.419588 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-jgxs5" event={"ID":"f0a45005-1762-44a5-a570-e1300c5a70e4","Type":"ContainerStarted","Data":"d34972bf09b194b92302fbee95b41af70663d60b852dbc42b2c8d9b740c74a04"} Apr 17 16:33:44.421109 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:44.420915 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-69b87c657d-gcmrf" 
event={"ID":"cfaadc66-808a-432f-b9e6-adfce98dfbac","Type":"ContainerStarted","Data":"0a4e0a77095cfdec20deed2ce441e87db693b999186dbcbef19fe2ff8581ab4a"} Apr 17 16:33:44.434119 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:44.434072 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-62lmr" podStartSLOduration=0.786532129 podStartE2EDuration="17.4340591s" podCreationTimestamp="2026-04-17 16:33:27 +0000 UTC" firstStartedPulling="2026-04-17 16:33:27.501054909 +0000 UTC m=+130.088643857" lastFinishedPulling="2026-04-17 16:33:44.14858188 +0000 UTC m=+146.736170828" observedRunningTime="2026-04-17 16:33:44.433797805 +0000 UTC m=+147.021386776" watchObservedRunningTime="2026-04-17 16:33:44.4340591 +0000 UTC m=+147.021648069" Apr 17 16:33:44.441780 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:44.441756 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-62lmr" Apr 17 16:33:44.451036 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:44.450997 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-69b87c657d-gcmrf" podStartSLOduration=1.221090243 podStartE2EDuration="11.450985779s" podCreationTimestamp="2026-04-17 16:33:33 +0000 UTC" firstStartedPulling="2026-04-17 16:33:33.868861598 +0000 UTC m=+136.456450553" lastFinishedPulling="2026-04-17 16:33:44.098757133 +0000 UTC m=+146.686346089" observedRunningTime="2026-04-17 16:33:44.449430098 +0000 UTC m=+147.037019070" watchObservedRunningTime="2026-04-17 16:33:44.450985779 +0000 UTC m=+147.038574749" Apr 17 16:33:44.554998 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:44.554935 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5b86668c6-g69fd"] Apr 17 16:33:44.558301 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:44.558277 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5b86668c6-g69fd" Apr 17 16:33:44.566370 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:44.566345 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 17 16:33:44.569694 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:44.569668 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5b86668c6-g69fd"] Apr 17 16:33:44.628044 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:44.628011 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bb8ee5e5-e9a4-428c-9671-5e63f3b4a300-oauth-serving-cert\") pod \"console-5b86668c6-g69fd\" (UID: \"bb8ee5e5-e9a4-428c-9671-5e63f3b4a300\") " pod="openshift-console/console-5b86668c6-g69fd" Apr 17 16:33:44.628223 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:44.628065 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bb8ee5e5-e9a4-428c-9671-5e63f3b4a300-console-config\") pod \"console-5b86668c6-g69fd\" (UID: \"bb8ee5e5-e9a4-428c-9671-5e63f3b4a300\") " pod="openshift-console/console-5b86668c6-g69fd" Apr 17 16:33:44.628223 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:44.628106 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bb8ee5e5-e9a4-428c-9671-5e63f3b4a300-trusted-ca-bundle\") pod \"console-5b86668c6-g69fd\" (UID: \"bb8ee5e5-e9a4-428c-9671-5e63f3b4a300\") " pod="openshift-console/console-5b86668c6-g69fd" Apr 17 16:33:44.628223 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:44.628167 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/bb8ee5e5-e9a4-428c-9671-5e63f3b4a300-console-oauth-config\") pod \"console-5b86668c6-g69fd\" (UID: \"bb8ee5e5-e9a4-428c-9671-5e63f3b4a300\") " pod="openshift-console/console-5b86668c6-g69fd" Apr 17 16:33:44.628223 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:44.628190 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9557b\" (UniqueName: \"kubernetes.io/projected/bb8ee5e5-e9a4-428c-9671-5e63f3b4a300-kube-api-access-9557b\") pod \"console-5b86668c6-g69fd\" (UID: \"bb8ee5e5-e9a4-428c-9671-5e63f3b4a300\") " pod="openshift-console/console-5b86668c6-g69fd" Apr 17 16:33:44.628223 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:44.628214 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bb8ee5e5-e9a4-428c-9671-5e63f3b4a300-service-ca\") pod \"console-5b86668c6-g69fd\" (UID: \"bb8ee5e5-e9a4-428c-9671-5e63f3b4a300\") " pod="openshift-console/console-5b86668c6-g69fd" Apr 17 16:33:44.628453 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:44.628389 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bb8ee5e5-e9a4-428c-9671-5e63f3b4a300-console-serving-cert\") pod \"console-5b86668c6-g69fd\" (UID: \"bb8ee5e5-e9a4-428c-9671-5e63f3b4a300\") " pod="openshift-console/console-5b86668c6-g69fd" Apr 17 16:33:44.729808 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:44.729697 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bb8ee5e5-e9a4-428c-9671-5e63f3b4a300-trusted-ca-bundle\") pod \"console-5b86668c6-g69fd\" (UID: \"bb8ee5e5-e9a4-428c-9671-5e63f3b4a300\") " pod="openshift-console/console-5b86668c6-g69fd" Apr 17 16:33:44.729808 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:44.729795 
2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bb8ee5e5-e9a4-428c-9671-5e63f3b4a300-console-oauth-config\") pod \"console-5b86668c6-g69fd\" (UID: \"bb8ee5e5-e9a4-428c-9671-5e63f3b4a300\") " pod="openshift-console/console-5b86668c6-g69fd" Apr 17 16:33:44.730023 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:44.729826 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9557b\" (UniqueName: \"kubernetes.io/projected/bb8ee5e5-e9a4-428c-9671-5e63f3b4a300-kube-api-access-9557b\") pod \"console-5b86668c6-g69fd\" (UID: \"bb8ee5e5-e9a4-428c-9671-5e63f3b4a300\") " pod="openshift-console/console-5b86668c6-g69fd" Apr 17 16:33:44.730023 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:44.729855 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bb8ee5e5-e9a4-428c-9671-5e63f3b4a300-service-ca\") pod \"console-5b86668c6-g69fd\" (UID: \"bb8ee5e5-e9a4-428c-9671-5e63f3b4a300\") " pod="openshift-console/console-5b86668c6-g69fd" Apr 17 16:33:44.730023 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:44.729937 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bb8ee5e5-e9a4-428c-9671-5e63f3b4a300-console-serving-cert\") pod \"console-5b86668c6-g69fd\" (UID: \"bb8ee5e5-e9a4-428c-9671-5e63f3b4a300\") " pod="openshift-console/console-5b86668c6-g69fd" Apr 17 16:33:44.730023 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:44.729971 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bb8ee5e5-e9a4-428c-9671-5e63f3b4a300-oauth-serving-cert\") pod \"console-5b86668c6-g69fd\" (UID: \"bb8ee5e5-e9a4-428c-9671-5e63f3b4a300\") " pod="openshift-console/console-5b86668c6-g69fd" Apr 17 
16:33:44.730023 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:44.730009 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bb8ee5e5-e9a4-428c-9671-5e63f3b4a300-console-config\") pod \"console-5b86668c6-g69fd\" (UID: \"bb8ee5e5-e9a4-428c-9671-5e63f3b4a300\") " pod="openshift-console/console-5b86668c6-g69fd" Apr 17 16:33:44.730772 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:44.730695 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bb8ee5e5-e9a4-428c-9671-5e63f3b4a300-service-ca\") pod \"console-5b86668c6-g69fd\" (UID: \"bb8ee5e5-e9a4-428c-9671-5e63f3b4a300\") " pod="openshift-console/console-5b86668c6-g69fd" Apr 17 16:33:44.730903 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:44.730859 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bb8ee5e5-e9a4-428c-9671-5e63f3b4a300-oauth-serving-cert\") pod \"console-5b86668c6-g69fd\" (UID: \"bb8ee5e5-e9a4-428c-9671-5e63f3b4a300\") " pod="openshift-console/console-5b86668c6-g69fd" Apr 17 16:33:44.730903 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:44.730881 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bb8ee5e5-e9a4-428c-9671-5e63f3b4a300-trusted-ca-bundle\") pod \"console-5b86668c6-g69fd\" (UID: \"bb8ee5e5-e9a4-428c-9671-5e63f3b4a300\") " pod="openshift-console/console-5b86668c6-g69fd" Apr 17 16:33:44.731393 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:44.731372 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bb8ee5e5-e9a4-428c-9671-5e63f3b4a300-console-config\") pod \"console-5b86668c6-g69fd\" (UID: \"bb8ee5e5-e9a4-428c-9671-5e63f3b4a300\") " pod="openshift-console/console-5b86668c6-g69fd" 
Apr 17 16:33:44.732768 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:44.732746 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bb8ee5e5-e9a4-428c-9671-5e63f3b4a300-console-serving-cert\") pod \"console-5b86668c6-g69fd\" (UID: \"bb8ee5e5-e9a4-428c-9671-5e63f3b4a300\") " pod="openshift-console/console-5b86668c6-g69fd" Apr 17 16:33:44.733150 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:44.733127 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bb8ee5e5-e9a4-428c-9671-5e63f3b4a300-console-oauth-config\") pod \"console-5b86668c6-g69fd\" (UID: \"bb8ee5e5-e9a4-428c-9671-5e63f3b4a300\") " pod="openshift-console/console-5b86668c6-g69fd" Apr 17 16:33:44.739694 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:44.739669 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9557b\" (UniqueName: \"kubernetes.io/projected/bb8ee5e5-e9a4-428c-9671-5e63f3b4a300-kube-api-access-9557b\") pod \"console-5b86668c6-g69fd\" (UID: \"bb8ee5e5-e9a4-428c-9671-5e63f3b4a300\") " pod="openshift-console/console-5b86668c6-g69fd" Apr 17 16:33:44.871516 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:44.871134 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5b86668c6-g69fd" Apr 17 16:33:45.028156 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:45.028074 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5b86668c6-g69fd"] Apr 17 16:33:45.034001 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:33:45.033969 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb8ee5e5_e9a4_428c_9671_5e63f3b4a300.slice/crio-d0fb14047e5347d4856587792b6b1167d1f9444ec8c45d5595594fded2be764d WatchSource:0}: Error finding container d0fb14047e5347d4856587792b6b1167d1f9444ec8c45d5595594fded2be764d: Status 404 returned error can't find the container with id d0fb14047e5347d4856587792b6b1167d1f9444ec8c45d5595594fded2be764d Apr 17 16:33:45.426750 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:45.426273 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5b86668c6-g69fd" event={"ID":"bb8ee5e5-e9a4-428c-9671-5e63f3b4a300","Type":"ContainerStarted","Data":"e62bdd4d598c2433124122ee4c8306b191fb64aa735480dcf3d26bfd0c02a0ba"} Apr 17 16:33:45.426750 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:45.426322 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5b86668c6-g69fd" event={"ID":"bb8ee5e5-e9a4-428c-9671-5e63f3b4a300","Type":"ContainerStarted","Data":"d0fb14047e5347d4856587792b6b1167d1f9444ec8c45d5595594fded2be764d"} Apr 17 16:33:45.444641 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:45.444593 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5b86668c6-g69fd" podStartSLOduration=1.444577636 podStartE2EDuration="1.444577636s" podCreationTimestamp="2026-04-17 16:33:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:33:45.443822492 +0000 UTC m=+148.031411463" 
watchObservedRunningTime="2026-04-17 16:33:45.444577636 +0000 UTC m=+148.032166607" Apr 17 16:33:46.432460 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:46.432416 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-jgxs5" event={"ID":"f0a45005-1762-44a5-a570-e1300c5a70e4","Type":"ContainerStarted","Data":"e738afb90847c0ca4e9b5c8ad50291726e2394e202f0cb9b9e3888cfb7949ff7"} Apr 17 16:33:46.432460 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:46.432467 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-jgxs5" event={"ID":"f0a45005-1762-44a5-a570-e1300c5a70e4","Type":"ContainerStarted","Data":"9175c6ae2c56a6bb8dbd9450363b21594ff43e0d0081eb5a8bf825711d8fa288"} Apr 17 16:33:46.451636 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:46.451578 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-jgxs5" podStartSLOduration=7.081559633 podStartE2EDuration="8.451563892s" podCreationTimestamp="2026-04-17 16:33:38 +0000 UTC" firstStartedPulling="2026-04-17 16:33:44.224132669 +0000 UTC m=+146.811721617" lastFinishedPulling="2026-04-17 16:33:45.594136925 +0000 UTC m=+148.181725876" observedRunningTime="2026-04-17 16:33:46.450241107 +0000 UTC m=+149.037830077" watchObservedRunningTime="2026-04-17 16:33:46.451563892 +0000 UTC m=+149.039152862" Apr 17 16:33:48.725205 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:48.725167 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-cc2ks"] Apr 17 16:33:48.738346 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:48.738319 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-cc2ks" Apr 17 16:33:48.741871 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:48.741838 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 17 16:33:48.742226 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:48.742209 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 17 16:33:48.742364 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:48.742213 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 17 16:33:48.743612 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:48.743587 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-bxbxp\"" Apr 17 16:33:48.764862 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:48.764834 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f55fedf9-4195-42a7-b7a3-7e91687196ed-node-exporter-tls\") pod \"node-exporter-cc2ks\" (UID: \"f55fedf9-4195-42a7-b7a3-7e91687196ed\") " pod="openshift-monitoring/node-exporter-cc2ks" Apr 17 16:33:48.764962 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:48.764902 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/f55fedf9-4195-42a7-b7a3-7e91687196ed-node-exporter-textfile\") pod \"node-exporter-cc2ks\" (UID: \"f55fedf9-4195-42a7-b7a3-7e91687196ed\") " pod="openshift-monitoring/node-exporter-cc2ks" Apr 17 16:33:48.764962 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:48.764927 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/f55fedf9-4195-42a7-b7a3-7e91687196ed-node-exporter-wtmp\") pod \"node-exporter-cc2ks\" (UID: \"f55fedf9-4195-42a7-b7a3-7e91687196ed\") " pod="openshift-monitoring/node-exporter-cc2ks" Apr 17 16:33:48.765075 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:48.765024 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f55fedf9-4195-42a7-b7a3-7e91687196ed-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-cc2ks\" (UID: \"f55fedf9-4195-42a7-b7a3-7e91687196ed\") " pod="openshift-monitoring/node-exporter-cc2ks" Apr 17 16:33:48.765131 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:48.765068 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f55fedf9-4195-42a7-b7a3-7e91687196ed-metrics-client-ca\") pod \"node-exporter-cc2ks\" (UID: \"f55fedf9-4195-42a7-b7a3-7e91687196ed\") " pod="openshift-monitoring/node-exporter-cc2ks" Apr 17 16:33:48.765131 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:48.765123 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cwdm\" (UniqueName: \"kubernetes.io/projected/f55fedf9-4195-42a7-b7a3-7e91687196ed-kube-api-access-2cwdm\") pod \"node-exporter-cc2ks\" (UID: \"f55fedf9-4195-42a7-b7a3-7e91687196ed\") " pod="openshift-monitoring/node-exporter-cc2ks" Apr 17 16:33:48.765228 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:48.765154 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f55fedf9-4195-42a7-b7a3-7e91687196ed-sys\") pod \"node-exporter-cc2ks\" (UID: \"f55fedf9-4195-42a7-b7a3-7e91687196ed\") " pod="openshift-monitoring/node-exporter-cc2ks" Apr 17 16:33:48.765228 
ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:48.765187 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/f55fedf9-4195-42a7-b7a3-7e91687196ed-node-exporter-accelerators-collector-config\") pod \"node-exporter-cc2ks\" (UID: \"f55fedf9-4195-42a7-b7a3-7e91687196ed\") " pod="openshift-monitoring/node-exporter-cc2ks" Apr 17 16:33:48.765319 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:48.765271 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/f55fedf9-4195-42a7-b7a3-7e91687196ed-root\") pod \"node-exporter-cc2ks\" (UID: \"f55fedf9-4195-42a7-b7a3-7e91687196ed\") " pod="openshift-monitoring/node-exporter-cc2ks" Apr 17 16:33:48.866401 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:48.866369 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/f55fedf9-4195-42a7-b7a3-7e91687196ed-node-exporter-textfile\") pod \"node-exporter-cc2ks\" (UID: \"f55fedf9-4195-42a7-b7a3-7e91687196ed\") " pod="openshift-monitoring/node-exporter-cc2ks" Apr 17 16:33:48.866401 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:48.866405 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/f55fedf9-4195-42a7-b7a3-7e91687196ed-node-exporter-wtmp\") pod \"node-exporter-cc2ks\" (UID: \"f55fedf9-4195-42a7-b7a3-7e91687196ed\") " pod="openshift-monitoring/node-exporter-cc2ks" Apr 17 16:33:48.866619 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:48.866427 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f55fedf9-4195-42a7-b7a3-7e91687196ed-node-exporter-kube-rbac-proxy-config\") 
pod \"node-exporter-cc2ks\" (UID: \"f55fedf9-4195-42a7-b7a3-7e91687196ed\") " pod="openshift-monitoring/node-exporter-cc2ks" Apr 17 16:33:48.866619 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:48.866454 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f55fedf9-4195-42a7-b7a3-7e91687196ed-metrics-client-ca\") pod \"node-exporter-cc2ks\" (UID: \"f55fedf9-4195-42a7-b7a3-7e91687196ed\") " pod="openshift-monitoring/node-exporter-cc2ks" Apr 17 16:33:48.866619 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:48.866497 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2cwdm\" (UniqueName: \"kubernetes.io/projected/f55fedf9-4195-42a7-b7a3-7e91687196ed-kube-api-access-2cwdm\") pod \"node-exporter-cc2ks\" (UID: \"f55fedf9-4195-42a7-b7a3-7e91687196ed\") " pod="openshift-monitoring/node-exporter-cc2ks" Apr 17 16:33:48.866619 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:48.866522 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f55fedf9-4195-42a7-b7a3-7e91687196ed-sys\") pod \"node-exporter-cc2ks\" (UID: \"f55fedf9-4195-42a7-b7a3-7e91687196ed\") " pod="openshift-monitoring/node-exporter-cc2ks" Apr 17 16:33:48.866619 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:48.866551 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/f55fedf9-4195-42a7-b7a3-7e91687196ed-node-exporter-accelerators-collector-config\") pod \"node-exporter-cc2ks\" (UID: \"f55fedf9-4195-42a7-b7a3-7e91687196ed\") " pod="openshift-monitoring/node-exporter-cc2ks" Apr 17 16:33:48.866619 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:48.866569 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: 
\"kubernetes.io/host-path/f55fedf9-4195-42a7-b7a3-7e91687196ed-node-exporter-wtmp\") pod \"node-exporter-cc2ks\" (UID: \"f55fedf9-4195-42a7-b7a3-7e91687196ed\") " pod="openshift-monitoring/node-exporter-cc2ks" Apr 17 16:33:48.866953 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:48.866642 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f55fedf9-4195-42a7-b7a3-7e91687196ed-sys\") pod \"node-exporter-cc2ks\" (UID: \"f55fedf9-4195-42a7-b7a3-7e91687196ed\") " pod="openshift-monitoring/node-exporter-cc2ks" Apr 17 16:33:48.866953 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:48.866747 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/f55fedf9-4195-42a7-b7a3-7e91687196ed-root\") pod \"node-exporter-cc2ks\" (UID: \"f55fedf9-4195-42a7-b7a3-7e91687196ed\") " pod="openshift-monitoring/node-exporter-cc2ks" Apr 17 16:33:48.866953 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:48.866791 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f55fedf9-4195-42a7-b7a3-7e91687196ed-node-exporter-tls\") pod \"node-exporter-cc2ks\" (UID: \"f55fedf9-4195-42a7-b7a3-7e91687196ed\") " pod="openshift-monitoring/node-exporter-cc2ks" Apr 17 16:33:48.866953 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:48.866924 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/f55fedf9-4195-42a7-b7a3-7e91687196ed-root\") pod \"node-exporter-cc2ks\" (UID: \"f55fedf9-4195-42a7-b7a3-7e91687196ed\") " pod="openshift-monitoring/node-exporter-cc2ks" Apr 17 16:33:48.867241 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:48.867217 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/f55fedf9-4195-42a7-b7a3-7e91687196ed-metrics-client-ca\") pod \"node-exporter-cc2ks\" (UID: \"f55fedf9-4195-42a7-b7a3-7e91687196ed\") " pod="openshift-monitoring/node-exporter-cc2ks" Apr 17 16:33:48.867782 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:48.867703 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/f55fedf9-4195-42a7-b7a3-7e91687196ed-node-exporter-accelerators-collector-config\") pod \"node-exporter-cc2ks\" (UID: \"f55fedf9-4195-42a7-b7a3-7e91687196ed\") " pod="openshift-monitoring/node-exporter-cc2ks" Apr 17 16:33:48.869384 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:48.869359 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f55fedf9-4195-42a7-b7a3-7e91687196ed-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-cc2ks\" (UID: \"f55fedf9-4195-42a7-b7a3-7e91687196ed\") " pod="openshift-monitoring/node-exporter-cc2ks" Apr 17 16:33:48.869996 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:48.869952 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f55fedf9-4195-42a7-b7a3-7e91687196ed-node-exporter-tls\") pod \"node-exporter-cc2ks\" (UID: \"f55fedf9-4195-42a7-b7a3-7e91687196ed\") " pod="openshift-monitoring/node-exporter-cc2ks" Apr 17 16:33:48.876382 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:48.876358 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/f55fedf9-4195-42a7-b7a3-7e91687196ed-node-exporter-textfile\") pod \"node-exporter-cc2ks\" (UID: \"f55fedf9-4195-42a7-b7a3-7e91687196ed\") " pod="openshift-monitoring/node-exporter-cc2ks" Apr 17 16:33:48.876535 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:48.876517 2572 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cwdm\" (UniqueName: \"kubernetes.io/projected/f55fedf9-4195-42a7-b7a3-7e91687196ed-kube-api-access-2cwdm\") pod \"node-exporter-cc2ks\" (UID: \"f55fedf9-4195-42a7-b7a3-7e91687196ed\") " pod="openshift-monitoring/node-exporter-cc2ks" Apr 17 16:33:49.050970 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:49.050900 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-cc2ks" Apr 17 16:33:49.062425 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:33:49.062386 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf55fedf9_4195_42a7_b7a3_7e91687196ed.slice/crio-2994e8e4c4504c245cdd21d137327d6df0b6e633c8b2be998fc0bbf6dd16d5e1 WatchSource:0}: Error finding container 2994e8e4c4504c245cdd21d137327d6df0b6e633c8b2be998fc0bbf6dd16d5e1: Status 404 returned error can't find the container with id 2994e8e4c4504c245cdd21d137327d6df0b6e633c8b2be998fc0bbf6dd16d5e1 Apr 17 16:33:49.442371 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:49.442336 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-cc2ks" event={"ID":"f55fedf9-4195-42a7-b7a3-7e91687196ed","Type":"ContainerStarted","Data":"2994e8e4c4504c245cdd21d137327d6df0b6e633c8b2be998fc0bbf6dd16d5e1"} Apr 17 16:33:49.776360 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:49.776269 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 16:33:49.795492 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:49.795457 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:33:49.795735 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:49.795692 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 16:33:49.799034 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:49.799011 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-wg4sf\"" Apr 17 16:33:49.799465 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:49.799442 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 17 16:33:49.799580 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:49.799488 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 17 16:33:49.799627 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:49.799488 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 17 16:33:49.799754 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:49.799741 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 17 16:33:49.799811 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:49.799780 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 17 16:33:49.799860 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:49.799826 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 17 16:33:49.800038 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:49.799947 2572 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 17 16:33:49.800175 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:49.800153 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 17 16:33:49.800236 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:49.800200 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 17 16:33:49.875151 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:49.875119 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/0e7da448-f5ed-4b33-9c73-443939def85b-config-volume\") pod \"alertmanager-main-0\" (UID: \"0e7da448-f5ed-4b33-9c73-443939def85b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:33:49.875364 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:49.875170 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/0e7da448-f5ed-4b33-9c73-443939def85b-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"0e7da448-f5ed-4b33-9c73-443939def85b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:33:49.875364 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:49.875209 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/0e7da448-f5ed-4b33-9c73-443939def85b-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"0e7da448-f5ed-4b33-9c73-443939def85b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:33:49.875364 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:49.875277 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/0e7da448-f5ed-4b33-9c73-443939def85b-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"0e7da448-f5ed-4b33-9c73-443939def85b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:33:49.875549 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:49.875372 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/0e7da448-f5ed-4b33-9c73-443939def85b-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"0e7da448-f5ed-4b33-9c73-443939def85b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:33:49.875549 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:49.875425 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0e7da448-f5ed-4b33-9c73-443939def85b-tls-assets\") pod \"alertmanager-main-0\" (UID: \"0e7da448-f5ed-4b33-9c73-443939def85b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:33:49.875549 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:49.875455 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0e7da448-f5ed-4b33-9c73-443939def85b-config-out\") pod \"alertmanager-main-0\" (UID: \"0e7da448-f5ed-4b33-9c73-443939def85b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:33:49.875549 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:49.875496 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0e7da448-f5ed-4b33-9c73-443939def85b-web-config\") pod \"alertmanager-main-0\" (UID: \"0e7da448-f5ed-4b33-9c73-443939def85b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:33:49.875549 
ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:49.875519 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/0e7da448-f5ed-4b33-9c73-443939def85b-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"0e7da448-f5ed-4b33-9c73-443939def85b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:33:49.875549 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:49.875542 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94nqs\" (UniqueName: \"kubernetes.io/projected/0e7da448-f5ed-4b33-9c73-443939def85b-kube-api-access-94nqs\") pod \"alertmanager-main-0\" (UID: \"0e7da448-f5ed-4b33-9c73-443939def85b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:33:49.875768 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:49.875570 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/0e7da448-f5ed-4b33-9c73-443939def85b-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"0e7da448-f5ed-4b33-9c73-443939def85b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:33:49.875768 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:49.875598 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0e7da448-f5ed-4b33-9c73-443939def85b-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"0e7da448-f5ed-4b33-9c73-443939def85b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:33:49.875768 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:49.875626 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/0e7da448-f5ed-4b33-9c73-443939def85b-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"0e7da448-f5ed-4b33-9c73-443939def85b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:33:49.976902 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:49.976865 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/0e7da448-f5ed-4b33-9c73-443939def85b-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"0e7da448-f5ed-4b33-9c73-443939def85b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:33:49.977098 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:49.976916 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0e7da448-f5ed-4b33-9c73-443939def85b-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"0e7da448-f5ed-4b33-9c73-443939def85b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:33:49.977098 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:49.976948 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0e7da448-f5ed-4b33-9c73-443939def85b-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"0e7da448-f5ed-4b33-9c73-443939def85b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:33:49.977763 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:49.977734 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0e7da448-f5ed-4b33-9c73-443939def85b-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"0e7da448-f5ed-4b33-9c73-443939def85b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:33:49.977877 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:49.977815 2572 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/0e7da448-f5ed-4b33-9c73-443939def85b-config-volume\") pod \"alertmanager-main-0\" (UID: \"0e7da448-f5ed-4b33-9c73-443939def85b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:33:49.977877 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:49.977859 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/0e7da448-f5ed-4b33-9c73-443939def85b-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"0e7da448-f5ed-4b33-9c73-443939def85b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:33:49.977971 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:49.977897 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/0e7da448-f5ed-4b33-9c73-443939def85b-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"0e7da448-f5ed-4b33-9c73-443939def85b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:33:49.978073 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:49.977979 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/0e7da448-f5ed-4b33-9c73-443939def85b-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"0e7da448-f5ed-4b33-9c73-443939def85b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:33:49.978146 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:49.978131 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/0e7da448-f5ed-4b33-9c73-443939def85b-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"0e7da448-f5ed-4b33-9c73-443939def85b\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:33:49.978217 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:49.978181 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0e7da448-f5ed-4b33-9c73-443939def85b-tls-assets\") pod \"alertmanager-main-0\" (UID: \"0e7da448-f5ed-4b33-9c73-443939def85b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:33:49.978274 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:49.978218 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0e7da448-f5ed-4b33-9c73-443939def85b-config-out\") pod \"alertmanager-main-0\" (UID: \"0e7da448-f5ed-4b33-9c73-443939def85b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:33:49.978274 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:49.978224 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0e7da448-f5ed-4b33-9c73-443939def85b-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"0e7da448-f5ed-4b33-9c73-443939def85b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:33:49.978274 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:49.978269 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0e7da448-f5ed-4b33-9c73-443939def85b-web-config\") pod \"alertmanager-main-0\" (UID: \"0e7da448-f5ed-4b33-9c73-443939def85b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:33:49.978400 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:49.978294 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/0e7da448-f5ed-4b33-9c73-443939def85b-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: 
\"0e7da448-f5ed-4b33-9c73-443939def85b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:33:49.978400 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:49.978320 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-94nqs\" (UniqueName: \"kubernetes.io/projected/0e7da448-f5ed-4b33-9c73-443939def85b-kube-api-access-94nqs\") pod \"alertmanager-main-0\" (UID: \"0e7da448-f5ed-4b33-9c73-443939def85b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:33:49.982485 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:49.982462 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0e7da448-f5ed-4b33-9c73-443939def85b-web-config\") pod \"alertmanager-main-0\" (UID: \"0e7da448-f5ed-4b33-9c73-443939def85b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:33:49.984405 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:49.984053 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0e7da448-f5ed-4b33-9c73-443939def85b-config-out\") pod \"alertmanager-main-0\" (UID: \"0e7da448-f5ed-4b33-9c73-443939def85b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:33:49.984405 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:49.984142 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/0e7da448-f5ed-4b33-9c73-443939def85b-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"0e7da448-f5ed-4b33-9c73-443939def85b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:33:49.984405 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:49.984366 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/0e7da448-f5ed-4b33-9c73-443939def85b-alertmanager-main-db\") pod 
\"alertmanager-main-0\" (UID: \"0e7da448-f5ed-4b33-9c73-443939def85b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:33:49.985191 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:49.985164 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/0e7da448-f5ed-4b33-9c73-443939def85b-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"0e7da448-f5ed-4b33-9c73-443939def85b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:33:49.985959 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:49.985551 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/0e7da448-f5ed-4b33-9c73-443939def85b-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"0e7da448-f5ed-4b33-9c73-443939def85b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:33:49.985959 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:49.985919 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/0e7da448-f5ed-4b33-9c73-443939def85b-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"0e7da448-f5ed-4b33-9c73-443939def85b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:33:49.986105 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:49.985991 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0e7da448-f5ed-4b33-9c73-443939def85b-tls-assets\") pod \"alertmanager-main-0\" (UID: \"0e7da448-f5ed-4b33-9c73-443939def85b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:33:49.986524 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:49.986499 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/secret/0e7da448-f5ed-4b33-9c73-443939def85b-config-volume\") pod \"alertmanager-main-0\" (UID: \"0e7da448-f5ed-4b33-9c73-443939def85b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:33:49.987298 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:49.987272 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/0e7da448-f5ed-4b33-9c73-443939def85b-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"0e7da448-f5ed-4b33-9c73-443939def85b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:33:49.991043 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:49.991022 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-94nqs\" (UniqueName: \"kubernetes.io/projected/0e7da448-f5ed-4b33-9c73-443939def85b-kube-api-access-94nqs\") pod \"alertmanager-main-0\" (UID: \"0e7da448-f5ed-4b33-9c73-443939def85b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:33:50.108135 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:50.108098 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:33:50.275361 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:50.275327 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 16:33:50.410834 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:33:50.409997 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e7da448_f5ed_4b33_9c73_443939def85b.slice/crio-b090c411e11d5b95cafe5e26bbd4bc5a324c9beaa289abe5f7b52a1610a45209 WatchSource:0}: Error finding container b090c411e11d5b95cafe5e26bbd4bc5a324c9beaa289abe5f7b52a1610a45209: Status 404 returned error can't find the container with id b090c411e11d5b95cafe5e26bbd4bc5a324c9beaa289abe5f7b52a1610a45209 Apr 17 16:33:50.446887 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:50.446832 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"0e7da448-f5ed-4b33-9c73-443939def85b","Type":"ContainerStarted","Data":"b090c411e11d5b95cafe5e26bbd4bc5a324c9beaa289abe5f7b52a1610a45209"} Apr 17 16:33:51.453636 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:51.453584 2572 generic.go:358] "Generic (PLEG): container finished" podID="f55fedf9-4195-42a7-b7a3-7e91687196ed" containerID="17abb8f44d66341984d98303f247dd876ea601c9e952d686e49ec93fe2dd776e" exitCode=0 Apr 17 16:33:51.454188 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:51.453689 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-cc2ks" event={"ID":"f55fedf9-4195-42a7-b7a3-7e91687196ed","Type":"ContainerDied","Data":"17abb8f44d66341984d98303f247dd876ea601c9e952d686e49ec93fe2dd776e"} Apr 17 16:33:52.459485 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:52.459440 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-cc2ks" 
event={"ID":"f55fedf9-4195-42a7-b7a3-7e91687196ed","Type":"ContainerStarted","Data":"1e4876a75bf683bc737f34aab67c2b49093d3bd68b7d3445aa2445121882ff1f"} Apr 17 16:33:52.459485 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:52.459481 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-cc2ks" event={"ID":"f55fedf9-4195-42a7-b7a3-7e91687196ed","Type":"ContainerStarted","Data":"95f43a8eec151584991a76eceb83bb7a1e4b943e5eb4ced37d9927661e26369d"} Apr 17 16:33:52.461120 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:52.461093 2572 generic.go:358] "Generic (PLEG): container finished" podID="0e7da448-f5ed-4b33-9c73-443939def85b" containerID="df56d4ad725f7fa865b651a00188a52400654d5905729e231c8b8aa9762a24c9" exitCode=0 Apr 17 16:33:52.461250 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:52.461155 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"0e7da448-f5ed-4b33-9c73-443939def85b","Type":"ContainerDied","Data":"df56d4ad725f7fa865b651a00188a52400654d5905729e231c8b8aa9762a24c9"} Apr 17 16:33:52.477221 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:52.477165 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-cc2ks" podStartSLOduration=3.108698488 podStartE2EDuration="4.477151714s" podCreationTimestamp="2026-04-17 16:33:48 +0000 UTC" firstStartedPulling="2026-04-17 16:33:49.064518946 +0000 UTC m=+151.652107895" lastFinishedPulling="2026-04-17 16:33:50.432972168 +0000 UTC m=+153.020561121" observedRunningTime="2026-04-17 16:33:52.476364096 +0000 UTC m=+155.063953067" watchObservedRunningTime="2026-04-17 16:33:52.477151714 +0000 UTC m=+155.064740684" Apr 17 16:33:53.719683 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:53.719635 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-69b87c657d-gcmrf" Apr 17 16:33:53.719683 ip-10-0-131-177 
kubenswrapper[2572]: I0417 16:33:53.719678 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-69b87c657d-gcmrf" Apr 17 16:33:53.725038 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:53.725009 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-69b87c657d-gcmrf" Apr 17 16:33:53.786358 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:33:53.786319 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-4nqbv" podUID="7d470685-9573-40d7-b32c-929ed88cc56d" Apr 17 16:33:53.792539 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:33:53.792497 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-gvlnm" podUID="d60e97bd-f20c-497d-ae2a-6dac86b93c77" Apr 17 16:33:54.468130 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:54.468095 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-gvlnm" Apr 17 16:33:54.468376 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:54.468342 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-4nqbv" Apr 17 16:33:54.471971 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:54.471946 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-69b87c657d-gcmrf" Apr 17 16:33:54.872379 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:54.872076 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5b86668c6-g69fd" Apr 17 16:33:54.872837 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:54.872819 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5b86668c6-g69fd" Apr 17 16:33:54.879055 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:54.879034 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5b86668c6-g69fd" Apr 17 16:33:55.368401 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:55.368364 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5b86668c6-g69fd"] Apr 17 16:33:55.400301 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:55.400262 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-69679fdccf-xcdxf"] Apr 17 16:33:55.424421 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:55.424396 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-69679fdccf-xcdxf"] Apr 17 16:33:55.424572 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:55.424533 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-69679fdccf-xcdxf"
Apr 17 16:33:55.473328 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:55.473297 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"0e7da448-f5ed-4b33-9c73-443939def85b","Type":"ContainerStarted","Data":"84c7f21b32b348ff10d05315e8f9279066672755acae5e3d7ad9505b57c48c61"}
Apr 17 16:33:55.473328 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:55.473328 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"0e7da448-f5ed-4b33-9c73-443939def85b","Type":"ContainerStarted","Data":"469590f2ec868fabd383fb05fc34a35f512010afd49e4bea34c9141721870655"}
Apr 17 16:33:55.473494 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:55.473338 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"0e7da448-f5ed-4b33-9c73-443939def85b","Type":"ContainerStarted","Data":"03d574588e90dcb08c455e3b523a3b10122a587ed6e9e2ef1b609ba15a9e4673"}
Apr 17 16:33:55.473494 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:55.473346 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"0e7da448-f5ed-4b33-9c73-443939def85b","Type":"ContainerStarted","Data":"c56401a75f8f0a5a89373c3d37855c7916e1dd0badd6b449829d95a8ac19b563"}
Apr 17 16:33:55.473494 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:55.473354 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"0e7da448-f5ed-4b33-9c73-443939def85b","Type":"ContainerStarted","Data":"24a013c6a19fefa83e6f2e41811ed0129c83f1bb0e8bd87c0b4b13734554b95d"}
Apr 17 16:33:55.477347 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:55.477328 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5b86668c6-g69fd"
Apr 17 16:33:55.534288 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:55.534207 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/133166db-7f9a-4de8-aca7-783d136f8480-console-config\") pod \"console-69679fdccf-xcdxf\" (UID: \"133166db-7f9a-4de8-aca7-783d136f8480\") " pod="openshift-console/console-69679fdccf-xcdxf"
Apr 17 16:33:55.534288 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:55.534248 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/133166db-7f9a-4de8-aca7-783d136f8480-oauth-serving-cert\") pod \"console-69679fdccf-xcdxf\" (UID: \"133166db-7f9a-4de8-aca7-783d136f8480\") " pod="openshift-console/console-69679fdccf-xcdxf"
Apr 17 16:33:55.534288 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:55.534278 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lm92n\" (UniqueName: \"kubernetes.io/projected/133166db-7f9a-4de8-aca7-783d136f8480-kube-api-access-lm92n\") pod \"console-69679fdccf-xcdxf\" (UID: \"133166db-7f9a-4de8-aca7-783d136f8480\") " pod="openshift-console/console-69679fdccf-xcdxf"
Apr 17 16:33:55.534538 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:55.534350 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/133166db-7f9a-4de8-aca7-783d136f8480-console-oauth-config\") pod \"console-69679fdccf-xcdxf\" (UID: \"133166db-7f9a-4de8-aca7-783d136f8480\") " pod="openshift-console/console-69679fdccf-xcdxf"
Apr 17 16:33:55.534538 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:55.534474 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/133166db-7f9a-4de8-aca7-783d136f8480-trusted-ca-bundle\") pod \"console-69679fdccf-xcdxf\" (UID: \"133166db-7f9a-4de8-aca7-783d136f8480\") " pod="openshift-console/console-69679fdccf-xcdxf"
Apr 17 16:33:55.534686 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:55.534665 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/133166db-7f9a-4de8-aca7-783d136f8480-console-serving-cert\") pod \"console-69679fdccf-xcdxf\" (UID: \"133166db-7f9a-4de8-aca7-783d136f8480\") " pod="openshift-console/console-69679fdccf-xcdxf"
Apr 17 16:33:55.534911 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:55.534821 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/133166db-7f9a-4de8-aca7-783d136f8480-service-ca\") pod \"console-69679fdccf-xcdxf\" (UID: \"133166db-7f9a-4de8-aca7-783d136f8480\") " pod="openshift-console/console-69679fdccf-xcdxf"
Apr 17 16:33:55.636191 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:55.636157 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/133166db-7f9a-4de8-aca7-783d136f8480-console-serving-cert\") pod \"console-69679fdccf-xcdxf\" (UID: \"133166db-7f9a-4de8-aca7-783d136f8480\") " pod="openshift-console/console-69679fdccf-xcdxf"
Apr 17 16:33:55.636191 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:55.636191 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/133166db-7f9a-4de8-aca7-783d136f8480-service-ca\") pod \"console-69679fdccf-xcdxf\" (UID: \"133166db-7f9a-4de8-aca7-783d136f8480\") " pod="openshift-console/console-69679fdccf-xcdxf"
Apr 17 16:33:55.636416 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:55.636233 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/133166db-7f9a-4de8-aca7-783d136f8480-console-config\") pod \"console-69679fdccf-xcdxf\" (UID: \"133166db-7f9a-4de8-aca7-783d136f8480\") " pod="openshift-console/console-69679fdccf-xcdxf"
Apr 17 16:33:55.636416 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:55.636252 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/133166db-7f9a-4de8-aca7-783d136f8480-oauth-serving-cert\") pod \"console-69679fdccf-xcdxf\" (UID: \"133166db-7f9a-4de8-aca7-783d136f8480\") " pod="openshift-console/console-69679fdccf-xcdxf"
Apr 17 16:33:55.636416 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:55.636286 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lm92n\" (UniqueName: \"kubernetes.io/projected/133166db-7f9a-4de8-aca7-783d136f8480-kube-api-access-lm92n\") pod \"console-69679fdccf-xcdxf\" (UID: \"133166db-7f9a-4de8-aca7-783d136f8480\") " pod="openshift-console/console-69679fdccf-xcdxf"
Apr 17 16:33:55.636416 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:55.636317 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/133166db-7f9a-4de8-aca7-783d136f8480-console-oauth-config\") pod \"console-69679fdccf-xcdxf\" (UID: \"133166db-7f9a-4de8-aca7-783d136f8480\") " pod="openshift-console/console-69679fdccf-xcdxf"
Apr 17 16:33:55.636416 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:55.636357 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/133166db-7f9a-4de8-aca7-783d136f8480-trusted-ca-bundle\") pod \"console-69679fdccf-xcdxf\" (UID: \"133166db-7f9a-4de8-aca7-783d136f8480\") " pod="openshift-console/console-69679fdccf-xcdxf"
Apr 17 16:33:55.637024 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:55.636999 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/133166db-7f9a-4de8-aca7-783d136f8480-service-ca\") pod \"console-69679fdccf-xcdxf\" (UID: \"133166db-7f9a-4de8-aca7-783d136f8480\") " pod="openshift-console/console-69679fdccf-xcdxf"
Apr 17 16:33:55.637121 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:55.637081 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/133166db-7f9a-4de8-aca7-783d136f8480-oauth-serving-cert\") pod \"console-69679fdccf-xcdxf\" (UID: \"133166db-7f9a-4de8-aca7-783d136f8480\") " pod="openshift-console/console-69679fdccf-xcdxf"
Apr 17 16:33:55.637234 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:55.637218 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/133166db-7f9a-4de8-aca7-783d136f8480-trusted-ca-bundle\") pod \"console-69679fdccf-xcdxf\" (UID: \"133166db-7f9a-4de8-aca7-783d136f8480\") " pod="openshift-console/console-69679fdccf-xcdxf"
Apr 17 16:33:55.637345 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:55.637322 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/133166db-7f9a-4de8-aca7-783d136f8480-console-config\") pod \"console-69679fdccf-xcdxf\" (UID: \"133166db-7f9a-4de8-aca7-783d136f8480\") " pod="openshift-console/console-69679fdccf-xcdxf"
Apr 17 16:33:55.638743 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:55.638689 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/133166db-7f9a-4de8-aca7-783d136f8480-console-serving-cert\") pod \"console-69679fdccf-xcdxf\" (UID: \"133166db-7f9a-4de8-aca7-783d136f8480\") " pod="openshift-console/console-69679fdccf-xcdxf"
Apr 17 16:33:55.638857 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:55.638836 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/133166db-7f9a-4de8-aca7-783d136f8480-console-oauth-config\") pod \"console-69679fdccf-xcdxf\" (UID: \"133166db-7f9a-4de8-aca7-783d136f8480\") " pod="openshift-console/console-69679fdccf-xcdxf"
Apr 17 16:33:55.659441 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:55.659419 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lm92n\" (UniqueName: \"kubernetes.io/projected/133166db-7f9a-4de8-aca7-783d136f8480-kube-api-access-lm92n\") pod \"console-69679fdccf-xcdxf\" (UID: \"133166db-7f9a-4de8-aca7-783d136f8480\") " pod="openshift-console/console-69679fdccf-xcdxf"
Apr 17 16:33:55.734554 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:55.734528 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-69679fdccf-xcdxf"
Apr 17 16:33:55.885081 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:55.885027 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-69679fdccf-xcdxf"]
Apr 17 16:33:55.888253 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:33:55.888226 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod133166db_7f9a_4de8_aca7_783d136f8480.slice/crio-d24491593e23cb660c1458e027972fec774999cde3314f941c3af4f566923731 WatchSource:0}: Error finding container d24491593e23cb660c1458e027972fec774999cde3314f941c3af4f566923731: Status 404 returned error can't find the container with id d24491593e23cb660c1458e027972fec774999cde3314f941c3af4f566923731
Apr 17 16:33:56.477817 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:56.477779 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-69679fdccf-xcdxf" event={"ID":"133166db-7f9a-4de8-aca7-783d136f8480","Type":"ContainerStarted","Data":"84d959831c2e0c4c42905fefa81ad8830e81eb963bece5e5ba68fa1a251d5ed9"}
Apr 17 16:33:56.477817 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:56.477817 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-69679fdccf-xcdxf" event={"ID":"133166db-7f9a-4de8-aca7-783d136f8480","Type":"ContainerStarted","Data":"d24491593e23cb660c1458e027972fec774999cde3314f941c3af4f566923731"}
Apr 17 16:33:56.498379 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:56.498336 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-69679fdccf-xcdxf" podStartSLOduration=1.498323417 podStartE2EDuration="1.498323417s" podCreationTimestamp="2026-04-17 16:33:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:33:56.497555719 +0000 UTC m=+159.085144688" watchObservedRunningTime="2026-04-17 16:33:56.498323417 +0000 UTC m=+159.085912387"
Apr 17 16:33:57.483162 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:57.483131 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"0e7da448-f5ed-4b33-9c73-443939def85b","Type":"ContainerStarted","Data":"97ea2bf7e2fd03fc15872aedf168c121caec5d95309b74c3971d68842484d6fc"}
Apr 17 16:33:57.511418 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:57.511374 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.383442931 podStartE2EDuration="8.511362373s" podCreationTimestamp="2026-04-17 16:33:49 +0000 UTC" firstStartedPulling="2026-04-17 16:33:50.429090017 +0000 UTC m=+153.016678971" lastFinishedPulling="2026-04-17 16:33:56.557009461 +0000 UTC m=+159.144598413" observedRunningTime="2026-04-17 16:33:57.51024113 +0000 UTC m=+160.097830100" watchObservedRunningTime="2026-04-17 16:33:57.511362373 +0000 UTC m=+160.098951343"
Apr 17 16:33:58.661282 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:58.661246 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d60e97bd-f20c-497d-ae2a-6dac86b93c77-cert\") pod \"ingress-canary-gvlnm\" (UID: \"d60e97bd-f20c-497d-ae2a-6dac86b93c77\") " pod="openshift-ingress-canary/ingress-canary-gvlnm"
Apr 17 16:33:58.661740 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:58.661330 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7d470685-9573-40d7-b32c-929ed88cc56d-metrics-tls\") pod \"dns-default-4nqbv\" (UID: \"7d470685-9573-40d7-b32c-929ed88cc56d\") " pod="openshift-dns/dns-default-4nqbv"
Apr 17 16:33:58.663765 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:58.663744 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7d470685-9573-40d7-b32c-929ed88cc56d-metrics-tls\") pod \"dns-default-4nqbv\" (UID: \"7d470685-9573-40d7-b32c-929ed88cc56d\") " pod="openshift-dns/dns-default-4nqbv"
Apr 17 16:33:58.663867 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:58.663848 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d60e97bd-f20c-497d-ae2a-6dac86b93c77-cert\") pod \"ingress-canary-gvlnm\" (UID: \"d60e97bd-f20c-497d-ae2a-6dac86b93c77\") " pod="openshift-ingress-canary/ingress-canary-gvlnm"
Apr 17 16:33:58.670892 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:58.670871 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-qsxk4\""
Apr 17 16:33:58.671518 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:58.671504 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-78gjs\""
Apr 17 16:33:58.679081 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:58.679060 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-gvlnm"
Apr 17 16:33:58.679081 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:58.679078 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-4nqbv"
Apr 17 16:33:58.805987 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:58.805879 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-4nqbv"]
Apr 17 16:33:58.808517 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:33:58.808491 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d470685_9573_40d7_b32c_929ed88cc56d.slice/crio-895325e36fa837d0823f10f4136ecc1fad92b6f8573bd4d9921bc601f8bb2ccc WatchSource:0}: Error finding container 895325e36fa837d0823f10f4136ecc1fad92b6f8573bd4d9921bc601f8bb2ccc: Status 404 returned error can't find the container with id 895325e36fa837d0823f10f4136ecc1fad92b6f8573bd4d9921bc601f8bb2ccc
Apr 17 16:33:58.824018 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:58.823997 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-gvlnm"]
Apr 17 16:33:58.826108 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:33:58.826086 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd60e97bd_f20c_497d_ae2a_6dac86b93c77.slice/crio-41dea299acf11b8a5ae549eddaa33589efa36412a93b6df528e7e6e7a0089c74 WatchSource:0}: Error finding container 41dea299acf11b8a5ae549eddaa33589efa36412a93b6df528e7e6e7a0089c74: Status 404 returned error can't find the container with id 41dea299acf11b8a5ae549eddaa33589efa36412a93b6df528e7e6e7a0089c74
Apr 17 16:33:59.490436 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:59.490394 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-4nqbv" event={"ID":"7d470685-9573-40d7-b32c-929ed88cc56d","Type":"ContainerStarted","Data":"895325e36fa837d0823f10f4136ecc1fad92b6f8573bd4d9921bc601f8bb2ccc"}
Apr 17 16:33:59.491638 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:33:59.491603 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-gvlnm" event={"ID":"d60e97bd-f20c-497d-ae2a-6dac86b93c77","Type":"ContainerStarted","Data":"41dea299acf11b8a5ae549eddaa33589efa36412a93b6df528e7e6e7a0089c74"}
Apr 17 16:34:01.500561 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:34:01.500533 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-gvlnm" event={"ID":"d60e97bd-f20c-497d-ae2a-6dac86b93c77","Type":"ContainerStarted","Data":"d1958d7d15ad9d9624a7e748caae8fda005539fb6b4901709a88f75531784c07"}
Apr 17 16:34:02.504647 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:34:02.504592 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-4nqbv" event={"ID":"7d470685-9573-40d7-b32c-929ed88cc56d","Type":"ContainerStarted","Data":"d02d085521f3e3410c1f20ee1dea96b223fc294abbe74b85c2b28de1357f999b"}
Apr 17 16:34:02.504647 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:34:02.504651 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-4nqbv" event={"ID":"7d470685-9573-40d7-b32c-929ed88cc56d","Type":"ContainerStarted","Data":"fc67959272ffcd4832680cc89aec1c55d189b2313b10f32fba9cdfe2ef956dea"}
Apr 17 16:34:02.519136 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:34:02.519089 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-gvlnm" podStartSLOduration=129.981792233 podStartE2EDuration="2m12.519077602s" podCreationTimestamp="2026-04-17 16:31:50 +0000 UTC" firstStartedPulling="2026-04-17 16:33:58.827752753 +0000 UTC m=+161.415341700" lastFinishedPulling="2026-04-17 16:34:01.365038117 +0000 UTC m=+163.952627069" observedRunningTime="2026-04-17 16:34:02.517818718 +0000 UTC m=+165.105407688" watchObservedRunningTime="2026-04-17 16:34:02.519077602 +0000 UTC m=+165.106666572"
Apr 17 16:34:02.533806 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:34:02.533763 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-4nqbv" podStartSLOduration=129.98369087 podStartE2EDuration="2m12.533748489s" podCreationTimestamp="2026-04-17 16:31:50 +0000 UTC" firstStartedPulling="2026-04-17 16:33:58.810466151 +0000 UTC m=+161.398055099" lastFinishedPulling="2026-04-17 16:34:01.360523756 +0000 UTC m=+163.948112718" observedRunningTime="2026-04-17 16:34:02.532766352 +0000 UTC m=+165.120355322" watchObservedRunningTime="2026-04-17 16:34:02.533748489 +0000 UTC m=+165.121337454"
Apr 17 16:34:03.507620 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:34:03.507589 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-4nqbv"
Apr 17 16:34:05.735543 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:34:05.735454 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-69679fdccf-xcdxf"
Apr 17 16:34:05.735957 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:34:05.735537 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-69679fdccf-xcdxf"
Apr 17 16:34:05.741000 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:34:05.740978 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-69679fdccf-xcdxf"
Apr 17 16:34:06.520333 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:34:06.520295 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-69679fdccf-xcdxf"
Apr 17 16:34:06.590307 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:34:06.590281 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-69b87c657d-gcmrf"]
Apr 17 16:34:10.533336 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:34:10.533301 2572 generic.go:358] "Generic (PLEG): container finished" podID="cb478791-e3d5-4b73-803c-4c43377c9ebc" containerID="48ab67fcc7161ac7bd8c0ceb076dc351145abfa962dc0c1f814da1caf4104eb3" exitCode=0
Apr 17 16:34:10.533734 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:34:10.533344 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-l6mkp" event={"ID":"cb478791-e3d5-4b73-803c-4c43377c9ebc","Type":"ContainerDied","Data":"48ab67fcc7161ac7bd8c0ceb076dc351145abfa962dc0c1f814da1caf4104eb3"}
Apr 17 16:34:10.533734 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:34:10.533634 2572 scope.go:117] "RemoveContainer" containerID="48ab67fcc7161ac7bd8c0ceb076dc351145abfa962dc0c1f814da1caf4104eb3"
Apr 17 16:34:11.538411 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:34:11.538381 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-l6mkp" event={"ID":"cb478791-e3d5-4b73-803c-4c43377c9ebc","Type":"ContainerStarted","Data":"e11f00e842f87feb0712fe1f0a18bdc875705e4788ffd04005d665a84c6358ee"}
Apr 17 16:34:13.513796 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:34:13.513767 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-4nqbv"
Apr 17 16:34:15.550795 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:34:15.550760 2572 generic.go:358] "Generic (PLEG): container finished" podID="f71c20f2-eb49-4550-bc66-4d1973c1fdc8" containerID="727c08970f719065c0f9c93acd41a9e61c1ee6f4a2fa4da679fac62ff400a14e" exitCode=0
Apr 17 16:34:15.550795 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:34:15.550797 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-cm9d7" event={"ID":"f71c20f2-eb49-4550-bc66-4d1973c1fdc8","Type":"ContainerDied","Data":"727c08970f719065c0f9c93acd41a9e61c1ee6f4a2fa4da679fac62ff400a14e"}
Apr 17 16:34:15.551229 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:34:15.551081 2572 scope.go:117] "RemoveContainer" containerID="727c08970f719065c0f9c93acd41a9e61c1ee6f4a2fa4da679fac62ff400a14e"
Apr 17 16:34:16.555364 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:34:16.555328 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-cm9d7" event={"ID":"f71c20f2-eb49-4550-bc66-4d1973c1fdc8","Type":"ContainerStarted","Data":"b48de1f0369facd04da97f020952bb87726ec358dce6e5b446d1e09e80baabbc"}
Apr 17 16:34:21.497006 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:34:21.496941 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5b86668c6-g69fd" podUID="bb8ee5e5-e9a4-428c-9671-5e63f3b4a300" containerName="console" containerID="cri-o://e62bdd4d598c2433124122ee4c8306b191fb64aa735480dcf3d26bfd0c02a0ba" gracePeriod=15
Apr 17 16:34:21.743057 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:34:21.743037 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5b86668c6-g69fd_bb8ee5e5-e9a4-428c-9671-5e63f3b4a300/console/0.log"
Apr 17 16:34:21.743154 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:34:21.743108 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5b86668c6-g69fd"
Apr 17 16:34:21.862081 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:34:21.862045 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9557b\" (UniqueName: \"kubernetes.io/projected/bb8ee5e5-e9a4-428c-9671-5e63f3b4a300-kube-api-access-9557b\") pod \"bb8ee5e5-e9a4-428c-9671-5e63f3b4a300\" (UID: \"bb8ee5e5-e9a4-428c-9671-5e63f3b4a300\") "
Apr 17 16:34:21.862240 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:34:21.862094 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bb8ee5e5-e9a4-428c-9671-5e63f3b4a300-oauth-serving-cert\") pod \"bb8ee5e5-e9a4-428c-9671-5e63f3b4a300\" (UID: \"bb8ee5e5-e9a4-428c-9671-5e63f3b4a300\") "
Apr 17 16:34:21.862240 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:34:21.862149 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bb8ee5e5-e9a4-428c-9671-5e63f3b4a300-console-oauth-config\") pod \"bb8ee5e5-e9a4-428c-9671-5e63f3b4a300\" (UID: \"bb8ee5e5-e9a4-428c-9671-5e63f3b4a300\") "
Apr 17 16:34:21.862240 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:34:21.862189 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bb8ee5e5-e9a4-428c-9671-5e63f3b4a300-console-serving-cert\") pod \"bb8ee5e5-e9a4-428c-9671-5e63f3b4a300\" (UID: \"bb8ee5e5-e9a4-428c-9671-5e63f3b4a300\") "
Apr 17 16:34:21.862240 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:34:21.862221 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bb8ee5e5-e9a4-428c-9671-5e63f3b4a300-trusted-ca-bundle\") pod \"bb8ee5e5-e9a4-428c-9671-5e63f3b4a300\" (UID: \"bb8ee5e5-e9a4-428c-9671-5e63f3b4a300\") "
Apr 17 16:34:21.862437 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:34:21.862285 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bb8ee5e5-e9a4-428c-9671-5e63f3b4a300-service-ca\") pod \"bb8ee5e5-e9a4-428c-9671-5e63f3b4a300\" (UID: \"bb8ee5e5-e9a4-428c-9671-5e63f3b4a300\") "
Apr 17 16:34:21.862437 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:34:21.862311 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bb8ee5e5-e9a4-428c-9671-5e63f3b4a300-console-config\") pod \"bb8ee5e5-e9a4-428c-9671-5e63f3b4a300\" (UID: \"bb8ee5e5-e9a4-428c-9671-5e63f3b4a300\") "
Apr 17 16:34:21.862620 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:34:21.862585 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb8ee5e5-e9a4-428c-9671-5e63f3b4a300-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "bb8ee5e5-e9a4-428c-9671-5e63f3b4a300" (UID: "bb8ee5e5-e9a4-428c-9671-5e63f3b4a300"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 16:34:21.862768 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:34:21.862703 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb8ee5e5-e9a4-428c-9671-5e63f3b4a300-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "bb8ee5e5-e9a4-428c-9671-5e63f3b4a300" (UID: "bb8ee5e5-e9a4-428c-9671-5e63f3b4a300"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 16:34:21.863037 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:34:21.863009 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb8ee5e5-e9a4-428c-9671-5e63f3b4a300-service-ca" (OuterVolumeSpecName: "service-ca") pod "bb8ee5e5-e9a4-428c-9671-5e63f3b4a300" (UID: "bb8ee5e5-e9a4-428c-9671-5e63f3b4a300"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 16:34:21.863089 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:34:21.863037 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb8ee5e5-e9a4-428c-9671-5e63f3b4a300-console-config" (OuterVolumeSpecName: "console-config") pod "bb8ee5e5-e9a4-428c-9671-5e63f3b4a300" (UID: "bb8ee5e5-e9a4-428c-9671-5e63f3b4a300"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 16:34:21.864592 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:34:21.864570 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb8ee5e5-e9a4-428c-9671-5e63f3b4a300-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "bb8ee5e5-e9a4-428c-9671-5e63f3b4a300" (UID: "bb8ee5e5-e9a4-428c-9671-5e63f3b4a300"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 16:34:21.864667 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:34:21.864595 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb8ee5e5-e9a4-428c-9671-5e63f3b4a300-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "bb8ee5e5-e9a4-428c-9671-5e63f3b4a300" (UID: "bb8ee5e5-e9a4-428c-9671-5e63f3b4a300"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 16:34:21.864667 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:34:21.864570 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb8ee5e5-e9a4-428c-9671-5e63f3b4a300-kube-api-access-9557b" (OuterVolumeSpecName: "kube-api-access-9557b") pod "bb8ee5e5-e9a4-428c-9671-5e63f3b4a300" (UID: "bb8ee5e5-e9a4-428c-9671-5e63f3b4a300"). InnerVolumeSpecName "kube-api-access-9557b". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 16:34:21.963150 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:34:21.963120 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9557b\" (UniqueName: \"kubernetes.io/projected/bb8ee5e5-e9a4-428c-9671-5e63f3b4a300-kube-api-access-9557b\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\""
Apr 17 16:34:21.963150 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:34:21.963145 2572 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bb8ee5e5-e9a4-428c-9671-5e63f3b4a300-oauth-serving-cert\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\""
Apr 17 16:34:21.963150 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:34:21.963154 2572 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bb8ee5e5-e9a4-428c-9671-5e63f3b4a300-console-oauth-config\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\""
Apr 17 16:34:21.963350 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:34:21.963163 2572 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bb8ee5e5-e9a4-428c-9671-5e63f3b4a300-console-serving-cert\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\""
Apr 17 16:34:21.963350 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:34:21.963172 2572 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bb8ee5e5-e9a4-428c-9671-5e63f3b4a300-trusted-ca-bundle\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\""
Apr 17 16:34:21.963350 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:34:21.963181 2572 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bb8ee5e5-e9a4-428c-9671-5e63f3b4a300-service-ca\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\""
Apr 17 16:34:21.963350 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:34:21.963189 2572 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bb8ee5e5-e9a4-428c-9671-5e63f3b4a300-console-config\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\""
Apr 17 16:34:22.572832 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:34:22.572804 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5b86668c6-g69fd_bb8ee5e5-e9a4-428c-9671-5e63f3b4a300/console/0.log"
Apr 17 16:34:22.573241 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:34:22.572842 2572 generic.go:358] "Generic (PLEG): container finished" podID="bb8ee5e5-e9a4-428c-9671-5e63f3b4a300" containerID="e62bdd4d598c2433124122ee4c8306b191fb64aa735480dcf3d26bfd0c02a0ba" exitCode=2
Apr 17 16:34:22.573241 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:34:22.572877 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5b86668c6-g69fd" event={"ID":"bb8ee5e5-e9a4-428c-9671-5e63f3b4a300","Type":"ContainerDied","Data":"e62bdd4d598c2433124122ee4c8306b191fb64aa735480dcf3d26bfd0c02a0ba"}
Apr 17 16:34:22.573241 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:34:22.572899 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5b86668c6-g69fd" event={"ID":"bb8ee5e5-e9a4-428c-9671-5e63f3b4a300","Type":"ContainerDied","Data":"d0fb14047e5347d4856587792b6b1167d1f9444ec8c45d5595594fded2be764d"}
Apr 17 16:34:22.573241 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:34:22.572906 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5b86668c6-g69fd"
Apr 17 16:34:22.573241 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:34:22.572913 2572 scope.go:117] "RemoveContainer" containerID="e62bdd4d598c2433124122ee4c8306b191fb64aa735480dcf3d26bfd0c02a0ba"
Apr 17 16:34:22.580831 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:34:22.580807 2572 scope.go:117] "RemoveContainer" containerID="e62bdd4d598c2433124122ee4c8306b191fb64aa735480dcf3d26bfd0c02a0ba"
Apr 17 16:34:22.581102 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:34:22.581085 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e62bdd4d598c2433124122ee4c8306b191fb64aa735480dcf3d26bfd0c02a0ba\": container with ID starting with e62bdd4d598c2433124122ee4c8306b191fb64aa735480dcf3d26bfd0c02a0ba not found: ID does not exist" containerID="e62bdd4d598c2433124122ee4c8306b191fb64aa735480dcf3d26bfd0c02a0ba"
Apr 17 16:34:22.581149 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:34:22.581112 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e62bdd4d598c2433124122ee4c8306b191fb64aa735480dcf3d26bfd0c02a0ba"} err="failed to get container status \"e62bdd4d598c2433124122ee4c8306b191fb64aa735480dcf3d26bfd0c02a0ba\": rpc error: code = NotFound desc = could not find container \"e62bdd4d598c2433124122ee4c8306b191fb64aa735480dcf3d26bfd0c02a0ba\": container with ID starting with e62bdd4d598c2433124122ee4c8306b191fb64aa735480dcf3d26bfd0c02a0ba not found: ID does not exist"
Apr 17 16:34:22.588012 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:34:22.587992 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5b86668c6-g69fd"]
Apr 17 16:34:22.594277 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:34:22.594257 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5b86668c6-g69fd"]
Apr 17 16:34:23.973048 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:34:23.973016 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb8ee5e5-e9a4-428c-9671-5e63f3b4a300" path="/var/lib/kubelet/pods/bb8ee5e5-e9a4-428c-9671-5e63f3b4a300/volumes"
Apr 17 16:34:31.613318 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:34:31.613247 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-69b87c657d-gcmrf" podUID="cfaadc66-808a-432f-b9e6-adfce98dfbac" containerName="console" containerID="cri-o://0a4e0a77095cfdec20deed2ce441e87db693b999186dbcbef19fe2ff8581ab4a" gracePeriod=15
Apr 17 16:34:31.871275 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:34:31.871217 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-69b87c657d-gcmrf_cfaadc66-808a-432f-b9e6-adfce98dfbac/console/0.log"
Apr 17 16:34:31.871384 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:34:31.871278 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-69b87c657d-gcmrf"
Apr 17 16:34:31.941531 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:34:31.941489 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cfaadc66-808a-432f-b9e6-adfce98dfbac-console-oauth-config\") pod \"cfaadc66-808a-432f-b9e6-adfce98dfbac\" (UID: \"cfaadc66-808a-432f-b9e6-adfce98dfbac\") "
Apr 17 16:34:31.941696 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:34:31.941539 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cfaadc66-808a-432f-b9e6-adfce98dfbac-service-ca\") pod \"cfaadc66-808a-432f-b9e6-adfce98dfbac\" (UID: \"cfaadc66-808a-432f-b9e6-adfce98dfbac\") "
Apr 17 16:34:31.941696 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:34:31.941581 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7vzz9\" (UniqueName: \"kubernetes.io/projected/cfaadc66-808a-432f-b9e6-adfce98dfbac-kube-api-access-7vzz9\") pod \"cfaadc66-808a-432f-b9e6-adfce98dfbac\" (UID: \"cfaadc66-808a-432f-b9e6-adfce98dfbac\") "
Apr 17 16:34:31.941696 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:34:31.941605 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cfaadc66-808a-432f-b9e6-adfce98dfbac-console-serving-cert\") pod \"cfaadc66-808a-432f-b9e6-adfce98dfbac\" (UID: \"cfaadc66-808a-432f-b9e6-adfce98dfbac\") "
Apr 17 16:34:31.941696 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:34:31.941630 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cfaadc66-808a-432f-b9e6-adfce98dfbac-console-config\") pod \"cfaadc66-808a-432f-b9e6-adfce98dfbac\" (UID: \"cfaadc66-808a-432f-b9e6-adfce98dfbac\") "
Apr 17 16:34:31.941696
ip-10-0-131-177 kubenswrapper[2572]: I0417 16:34:31.941663 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cfaadc66-808a-432f-b9e6-adfce98dfbac-oauth-serving-cert\") pod \"cfaadc66-808a-432f-b9e6-adfce98dfbac\" (UID: \"cfaadc66-808a-432f-b9e6-adfce98dfbac\") " Apr 17 16:34:31.942075 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:34:31.942032 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cfaadc66-808a-432f-b9e6-adfce98dfbac-console-config" (OuterVolumeSpecName: "console-config") pod "cfaadc66-808a-432f-b9e6-adfce98dfbac" (UID: "cfaadc66-808a-432f-b9e6-adfce98dfbac"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:34:31.942210 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:34:31.942180 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cfaadc66-808a-432f-b9e6-adfce98dfbac-service-ca" (OuterVolumeSpecName: "service-ca") pod "cfaadc66-808a-432f-b9e6-adfce98dfbac" (UID: "cfaadc66-808a-432f-b9e6-adfce98dfbac"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:34:31.942210 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:34:31.942185 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cfaadc66-808a-432f-b9e6-adfce98dfbac-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "cfaadc66-808a-432f-b9e6-adfce98dfbac" (UID: "cfaadc66-808a-432f-b9e6-adfce98dfbac"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:34:31.943890 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:34:31.943859 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfaadc66-808a-432f-b9e6-adfce98dfbac-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "cfaadc66-808a-432f-b9e6-adfce98dfbac" (UID: "cfaadc66-808a-432f-b9e6-adfce98dfbac"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:34:31.943890 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:34:31.943878 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfaadc66-808a-432f-b9e6-adfce98dfbac-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "cfaadc66-808a-432f-b9e6-adfce98dfbac" (UID: "cfaadc66-808a-432f-b9e6-adfce98dfbac"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:34:31.944007 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:34:31.943929 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfaadc66-808a-432f-b9e6-adfce98dfbac-kube-api-access-7vzz9" (OuterVolumeSpecName: "kube-api-access-7vzz9") pod "cfaadc66-808a-432f-b9e6-adfce98dfbac" (UID: "cfaadc66-808a-432f-b9e6-adfce98dfbac"). InnerVolumeSpecName "kube-api-access-7vzz9". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:34:32.042227 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:34:32.042206 2572 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cfaadc66-808a-432f-b9e6-adfce98dfbac-console-oauth-config\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\"" Apr 17 16:34:32.042227 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:34:32.042229 2572 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cfaadc66-808a-432f-b9e6-adfce98dfbac-service-ca\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\"" Apr 17 16:34:32.042368 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:34:32.042239 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7vzz9\" (UniqueName: \"kubernetes.io/projected/cfaadc66-808a-432f-b9e6-adfce98dfbac-kube-api-access-7vzz9\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\"" Apr 17 16:34:32.042368 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:34:32.042254 2572 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cfaadc66-808a-432f-b9e6-adfce98dfbac-console-serving-cert\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\"" Apr 17 16:34:32.042368 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:34:32.042266 2572 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cfaadc66-808a-432f-b9e6-adfce98dfbac-console-config\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\"" Apr 17 16:34:32.042368 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:34:32.042274 2572 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cfaadc66-808a-432f-b9e6-adfce98dfbac-oauth-serving-cert\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\"" Apr 17 16:34:32.613032 ip-10-0-131-177 
kubenswrapper[2572]: I0417 16:34:32.612998 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-69b87c657d-gcmrf_cfaadc66-808a-432f-b9e6-adfce98dfbac/console/0.log" Apr 17 16:34:32.613197 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:34:32.613036 2572 generic.go:358] "Generic (PLEG): container finished" podID="cfaadc66-808a-432f-b9e6-adfce98dfbac" containerID="0a4e0a77095cfdec20deed2ce441e87db693b999186dbcbef19fe2ff8581ab4a" exitCode=2 Apr 17 16:34:32.613197 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:34:32.613117 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-69b87c657d-gcmrf" event={"ID":"cfaadc66-808a-432f-b9e6-adfce98dfbac","Type":"ContainerDied","Data":"0a4e0a77095cfdec20deed2ce441e87db693b999186dbcbef19fe2ff8581ab4a"} Apr 17 16:34:32.613197 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:34:32.613138 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-69b87c657d-gcmrf" event={"ID":"cfaadc66-808a-432f-b9e6-adfce98dfbac","Type":"ContainerDied","Data":"cf1d122e2302c5a6e8aec945f8eca84585cea6321d6b058fee6d8a44b1a5167f"} Apr 17 16:34:32.613197 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:34:32.613151 2572 scope.go:117] "RemoveContainer" containerID="0a4e0a77095cfdec20deed2ce441e87db693b999186dbcbef19fe2ff8581ab4a" Apr 17 16:34:32.613197 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:34:32.613118 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-69b87c657d-gcmrf" Apr 17 16:34:32.621338 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:34:32.621316 2572 scope.go:117] "RemoveContainer" containerID="0a4e0a77095cfdec20deed2ce441e87db693b999186dbcbef19fe2ff8581ab4a" Apr 17 16:34:32.621588 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:34:32.621569 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a4e0a77095cfdec20deed2ce441e87db693b999186dbcbef19fe2ff8581ab4a\": container with ID starting with 0a4e0a77095cfdec20deed2ce441e87db693b999186dbcbef19fe2ff8581ab4a not found: ID does not exist" containerID="0a4e0a77095cfdec20deed2ce441e87db693b999186dbcbef19fe2ff8581ab4a" Apr 17 16:34:32.621633 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:34:32.621599 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a4e0a77095cfdec20deed2ce441e87db693b999186dbcbef19fe2ff8581ab4a"} err="failed to get container status \"0a4e0a77095cfdec20deed2ce441e87db693b999186dbcbef19fe2ff8581ab4a\": rpc error: code = NotFound desc = could not find container \"0a4e0a77095cfdec20deed2ce441e87db693b999186dbcbef19fe2ff8581ab4a\": container with ID starting with 0a4e0a77095cfdec20deed2ce441e87db693b999186dbcbef19fe2ff8581ab4a not found: ID does not exist" Apr 17 16:34:32.628197 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:34:32.628178 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-69b87c657d-gcmrf"] Apr 17 16:34:32.631949 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:34:32.631931 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-69b87c657d-gcmrf"] Apr 17 16:34:33.972537 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:34:33.972504 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cfaadc66-808a-432f-b9e6-adfce98dfbac" 
path="/var/lib/kubelet/pods/cfaadc66-808a-432f-b9e6-adfce98dfbac/volumes" Apr 17 16:35:13.005368 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:13.005333 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-5fcbc8f5f4-cnkrt"] Apr 17 16:35:13.006418 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:13.006388 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bb8ee5e5-e9a4-428c-9671-5e63f3b4a300" containerName="console" Apr 17 16:35:13.006418 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:13.006416 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb8ee5e5-e9a4-428c-9671-5e63f3b4a300" containerName="console" Apr 17 16:35:13.006577 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:13.006441 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cfaadc66-808a-432f-b9e6-adfce98dfbac" containerName="console" Apr 17 16:35:13.006577 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:13.006449 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfaadc66-808a-432f-b9e6-adfce98dfbac" containerName="console" Apr 17 16:35:13.006577 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:13.006527 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="cfaadc66-808a-432f-b9e6-adfce98dfbac" containerName="console" Apr 17 16:35:13.006577 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:13.006543 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="bb8ee5e5-e9a4-428c-9671-5e63f3b4a300" containerName="console" Apr 17 16:35:13.009691 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:13.009673 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-5fcbc8f5f4-cnkrt" Apr 17 16:35:13.013247 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:13.013212 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 17 16:35:13.013360 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:13.013253 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 17 16:35:13.013631 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:13.013266 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 17 16:35:13.013891 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:13.013861 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-bgpd6\"" Apr 17 16:35:13.013981 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:13.013896 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 17 16:35:13.014483 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:13.014458 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 17 16:35:13.022706 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:13.022680 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-5fcbc8f5f4-cnkrt"] Apr 17 16:35:13.024036 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:13.023617 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 17 16:35:13.147031 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:13.146996 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/22ec0cf7-7086-4276-80b3-231844c0a7a5-secret-telemeter-client\") pod \"telemeter-client-5fcbc8f5f4-cnkrt\" (UID: \"22ec0cf7-7086-4276-80b3-231844c0a7a5\") " pod="openshift-monitoring/telemeter-client-5fcbc8f5f4-cnkrt" Apr 17 16:35:13.147031 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:13.147032 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/22ec0cf7-7086-4276-80b3-231844c0a7a5-telemeter-client-tls\") pod \"telemeter-client-5fcbc8f5f4-cnkrt\" (UID: \"22ec0cf7-7086-4276-80b3-231844c0a7a5\") " pod="openshift-monitoring/telemeter-client-5fcbc8f5f4-cnkrt" Apr 17 16:35:13.147257 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:13.147061 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/22ec0cf7-7086-4276-80b3-231844c0a7a5-telemeter-trusted-ca-bundle\") pod \"telemeter-client-5fcbc8f5f4-cnkrt\" (UID: \"22ec0cf7-7086-4276-80b3-231844c0a7a5\") " pod="openshift-monitoring/telemeter-client-5fcbc8f5f4-cnkrt" Apr 17 16:35:13.147257 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:13.147149 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qc5q9\" (UniqueName: \"kubernetes.io/projected/22ec0cf7-7086-4276-80b3-231844c0a7a5-kube-api-access-qc5q9\") pod \"telemeter-client-5fcbc8f5f4-cnkrt\" (UID: \"22ec0cf7-7086-4276-80b3-231844c0a7a5\") " pod="openshift-monitoring/telemeter-client-5fcbc8f5f4-cnkrt" Apr 17 16:35:13.147257 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:13.147178 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/22ec0cf7-7086-4276-80b3-231844c0a7a5-metrics-client-ca\") pod 
\"telemeter-client-5fcbc8f5f4-cnkrt\" (UID: \"22ec0cf7-7086-4276-80b3-231844c0a7a5\") " pod="openshift-monitoring/telemeter-client-5fcbc8f5f4-cnkrt" Apr 17 16:35:13.147257 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:13.147201 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/22ec0cf7-7086-4276-80b3-231844c0a7a5-serving-certs-ca-bundle\") pod \"telemeter-client-5fcbc8f5f4-cnkrt\" (UID: \"22ec0cf7-7086-4276-80b3-231844c0a7a5\") " pod="openshift-monitoring/telemeter-client-5fcbc8f5f4-cnkrt" Apr 17 16:35:13.147257 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:13.147217 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/22ec0cf7-7086-4276-80b3-231844c0a7a5-federate-client-tls\") pod \"telemeter-client-5fcbc8f5f4-cnkrt\" (UID: \"22ec0cf7-7086-4276-80b3-231844c0a7a5\") " pod="openshift-monitoring/telemeter-client-5fcbc8f5f4-cnkrt" Apr 17 16:35:13.147413 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:13.147288 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/22ec0cf7-7086-4276-80b3-231844c0a7a5-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-5fcbc8f5f4-cnkrt\" (UID: \"22ec0cf7-7086-4276-80b3-231844c0a7a5\") " pod="openshift-monitoring/telemeter-client-5fcbc8f5f4-cnkrt" Apr 17 16:35:13.247768 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:13.247733 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/22ec0cf7-7086-4276-80b3-231844c0a7a5-telemeter-trusted-ca-bundle\") pod \"telemeter-client-5fcbc8f5f4-cnkrt\" (UID: \"22ec0cf7-7086-4276-80b3-231844c0a7a5\") " 
pod="openshift-monitoring/telemeter-client-5fcbc8f5f4-cnkrt" Apr 17 16:35:13.247894 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:13.247776 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qc5q9\" (UniqueName: \"kubernetes.io/projected/22ec0cf7-7086-4276-80b3-231844c0a7a5-kube-api-access-qc5q9\") pod \"telemeter-client-5fcbc8f5f4-cnkrt\" (UID: \"22ec0cf7-7086-4276-80b3-231844c0a7a5\") " pod="openshift-monitoring/telemeter-client-5fcbc8f5f4-cnkrt" Apr 17 16:35:13.247894 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:13.247800 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/22ec0cf7-7086-4276-80b3-231844c0a7a5-metrics-client-ca\") pod \"telemeter-client-5fcbc8f5f4-cnkrt\" (UID: \"22ec0cf7-7086-4276-80b3-231844c0a7a5\") " pod="openshift-monitoring/telemeter-client-5fcbc8f5f4-cnkrt" Apr 17 16:35:13.247894 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:13.247823 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/22ec0cf7-7086-4276-80b3-231844c0a7a5-serving-certs-ca-bundle\") pod \"telemeter-client-5fcbc8f5f4-cnkrt\" (UID: \"22ec0cf7-7086-4276-80b3-231844c0a7a5\") " pod="openshift-monitoring/telemeter-client-5fcbc8f5f4-cnkrt" Apr 17 16:35:13.247894 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:13.247838 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/22ec0cf7-7086-4276-80b3-231844c0a7a5-federate-client-tls\") pod \"telemeter-client-5fcbc8f5f4-cnkrt\" (UID: \"22ec0cf7-7086-4276-80b3-231844c0a7a5\") " pod="openshift-monitoring/telemeter-client-5fcbc8f5f4-cnkrt" Apr 17 16:35:13.247894 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:13.247867 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/22ec0cf7-7086-4276-80b3-231844c0a7a5-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-5fcbc8f5f4-cnkrt\" (UID: \"22ec0cf7-7086-4276-80b3-231844c0a7a5\") " pod="openshift-monitoring/telemeter-client-5fcbc8f5f4-cnkrt" Apr 17 16:35:13.248183 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:13.248020 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/22ec0cf7-7086-4276-80b3-231844c0a7a5-secret-telemeter-client\") pod \"telemeter-client-5fcbc8f5f4-cnkrt\" (UID: \"22ec0cf7-7086-4276-80b3-231844c0a7a5\") " pod="openshift-monitoring/telemeter-client-5fcbc8f5f4-cnkrt" Apr 17 16:35:13.248183 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:13.248148 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/22ec0cf7-7086-4276-80b3-231844c0a7a5-telemeter-client-tls\") pod \"telemeter-client-5fcbc8f5f4-cnkrt\" (UID: \"22ec0cf7-7086-4276-80b3-231844c0a7a5\") " pod="openshift-monitoring/telemeter-client-5fcbc8f5f4-cnkrt" Apr 17 16:35:13.248607 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:13.248586 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/22ec0cf7-7086-4276-80b3-231844c0a7a5-serving-certs-ca-bundle\") pod \"telemeter-client-5fcbc8f5f4-cnkrt\" (UID: \"22ec0cf7-7086-4276-80b3-231844c0a7a5\") " pod="openshift-monitoring/telemeter-client-5fcbc8f5f4-cnkrt" Apr 17 16:35:13.248710 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:13.248690 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/22ec0cf7-7086-4276-80b3-231844c0a7a5-metrics-client-ca\") pod \"telemeter-client-5fcbc8f5f4-cnkrt\" (UID: 
\"22ec0cf7-7086-4276-80b3-231844c0a7a5\") " pod="openshift-monitoring/telemeter-client-5fcbc8f5f4-cnkrt" Apr 17 16:35:13.248977 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:13.248955 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/22ec0cf7-7086-4276-80b3-231844c0a7a5-telemeter-trusted-ca-bundle\") pod \"telemeter-client-5fcbc8f5f4-cnkrt\" (UID: \"22ec0cf7-7086-4276-80b3-231844c0a7a5\") " pod="openshift-monitoring/telemeter-client-5fcbc8f5f4-cnkrt" Apr 17 16:35:13.250623 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:13.250599 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/22ec0cf7-7086-4276-80b3-231844c0a7a5-federate-client-tls\") pod \"telemeter-client-5fcbc8f5f4-cnkrt\" (UID: \"22ec0cf7-7086-4276-80b3-231844c0a7a5\") " pod="openshift-monitoring/telemeter-client-5fcbc8f5f4-cnkrt" Apr 17 16:35:13.250710 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:13.250653 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/22ec0cf7-7086-4276-80b3-231844c0a7a5-telemeter-client-tls\") pod \"telemeter-client-5fcbc8f5f4-cnkrt\" (UID: \"22ec0cf7-7086-4276-80b3-231844c0a7a5\") " pod="openshift-monitoring/telemeter-client-5fcbc8f5f4-cnkrt" Apr 17 16:35:13.251152 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:13.251130 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/22ec0cf7-7086-4276-80b3-231844c0a7a5-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-5fcbc8f5f4-cnkrt\" (UID: \"22ec0cf7-7086-4276-80b3-231844c0a7a5\") " pod="openshift-monitoring/telemeter-client-5fcbc8f5f4-cnkrt" Apr 17 16:35:13.251219 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:13.251141 2572 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/22ec0cf7-7086-4276-80b3-231844c0a7a5-secret-telemeter-client\") pod \"telemeter-client-5fcbc8f5f4-cnkrt\" (UID: \"22ec0cf7-7086-4276-80b3-231844c0a7a5\") " pod="openshift-monitoring/telemeter-client-5fcbc8f5f4-cnkrt" Apr 17 16:35:13.256651 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:13.256600 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qc5q9\" (UniqueName: \"kubernetes.io/projected/22ec0cf7-7086-4276-80b3-231844c0a7a5-kube-api-access-qc5q9\") pod \"telemeter-client-5fcbc8f5f4-cnkrt\" (UID: \"22ec0cf7-7086-4276-80b3-231844c0a7a5\") " pod="openshift-monitoring/telemeter-client-5fcbc8f5f4-cnkrt" Apr 17 16:35:13.326656 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:13.326627 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-5fcbc8f5f4-cnkrt" Apr 17 16:35:13.452118 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:13.452093 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-5fcbc8f5f4-cnkrt"] Apr 17 16:35:13.454692 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:35:13.454656 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22ec0cf7_7086_4276_80b3_231844c0a7a5.slice/crio-790411c1c3ac70c665c4b6a1fe38ddd3f7d4d5ab388f81221a2246cd9fd0257f WatchSource:0}: Error finding container 790411c1c3ac70c665c4b6a1fe38ddd3f7d4d5ab388f81221a2246cd9fd0257f: Status 404 returned error can't find the container with id 790411c1c3ac70c665c4b6a1fe38ddd3f7d4d5ab388f81221a2246cd9fd0257f Apr 17 16:35:13.730744 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:13.730693 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5fcbc8f5f4-cnkrt" 
event={"ID":"22ec0cf7-7086-4276-80b3-231844c0a7a5","Type":"ContainerStarted","Data":"790411c1c3ac70c665c4b6a1fe38ddd3f7d4d5ab388f81221a2246cd9fd0257f"} Apr 17 16:35:15.740832 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:15.740798 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5fcbc8f5f4-cnkrt" event={"ID":"22ec0cf7-7086-4276-80b3-231844c0a7a5","Type":"ContainerStarted","Data":"1822be73a684984dcf7b1bcff9161873ed94183f2d51dd880e26fcb452f31644"} Apr 17 16:35:15.740832 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:15.740833 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5fcbc8f5f4-cnkrt" event={"ID":"22ec0cf7-7086-4276-80b3-231844c0a7a5","Type":"ContainerStarted","Data":"ea659b80d365bf9590ade818b7a3b858163ac508fd81573c532dcdec8aa9ea0a"} Apr 17 16:35:15.741227 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:15.740843 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5fcbc8f5f4-cnkrt" event={"ID":"22ec0cf7-7086-4276-80b3-231844c0a7a5","Type":"ContainerStarted","Data":"3952124c0c8b88df6ab929c4da023b6d0a400f4e256d6d5de61588d761570282"} Apr 17 16:35:15.762432 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:15.762374 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-5fcbc8f5f4-cnkrt" podStartSLOduration=2.016863565 podStartE2EDuration="3.762356322s" podCreationTimestamp="2026-04-17 16:35:12 +0000 UTC" firstStartedPulling="2026-04-17 16:35:13.456422002 +0000 UTC m=+236.044010950" lastFinishedPulling="2026-04-17 16:35:15.201914756 +0000 UTC m=+237.789503707" observedRunningTime="2026-04-17 16:35:15.761441582 +0000 UTC m=+238.349030577" watchObservedRunningTime="2026-04-17 16:35:15.762356322 +0000 UTC m=+238.349945290" Apr 17 16:35:16.432737 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:16.432677 2572 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-console/console-59fc7c8489-m2gx8"] Apr 17 16:35:16.436307 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:16.436272 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-59fc7c8489-m2gx8" Apr 17 16:35:16.447224 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:16.447197 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-59fc7c8489-m2gx8"] Apr 17 16:35:16.577135 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:16.577094 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkzd5\" (UniqueName: \"kubernetes.io/projected/245ec869-f2a1-4769-8d64-d2bd9e0bca15-kube-api-access-qkzd5\") pod \"console-59fc7c8489-m2gx8\" (UID: \"245ec869-f2a1-4769-8d64-d2bd9e0bca15\") " pod="openshift-console/console-59fc7c8489-m2gx8" Apr 17 16:35:16.577135 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:16.577141 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/245ec869-f2a1-4769-8d64-d2bd9e0bca15-service-ca\") pod \"console-59fc7c8489-m2gx8\" (UID: \"245ec869-f2a1-4769-8d64-d2bd9e0bca15\") " pod="openshift-console/console-59fc7c8489-m2gx8" Apr 17 16:35:16.577343 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:16.577216 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/245ec869-f2a1-4769-8d64-d2bd9e0bca15-oauth-serving-cert\") pod \"console-59fc7c8489-m2gx8\" (UID: \"245ec869-f2a1-4769-8d64-d2bd9e0bca15\") " pod="openshift-console/console-59fc7c8489-m2gx8" Apr 17 16:35:16.577343 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:16.577271 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/245ec869-f2a1-4769-8d64-d2bd9e0bca15-console-config\") pod \"console-59fc7c8489-m2gx8\" (UID: \"245ec869-f2a1-4769-8d64-d2bd9e0bca15\") " pod="openshift-console/console-59fc7c8489-m2gx8" Apr 17 16:35:16.577343 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:16.577303 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/245ec869-f2a1-4769-8d64-d2bd9e0bca15-trusted-ca-bundle\") pod \"console-59fc7c8489-m2gx8\" (UID: \"245ec869-f2a1-4769-8d64-d2bd9e0bca15\") " pod="openshift-console/console-59fc7c8489-m2gx8" Apr 17 16:35:16.577343 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:16.577328 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/245ec869-f2a1-4769-8d64-d2bd9e0bca15-console-oauth-config\") pod \"console-59fc7c8489-m2gx8\" (UID: \"245ec869-f2a1-4769-8d64-d2bd9e0bca15\") " pod="openshift-console/console-59fc7c8489-m2gx8" Apr 17 16:35:16.577343 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:16.577344 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/245ec869-f2a1-4769-8d64-d2bd9e0bca15-console-serving-cert\") pod \"console-59fc7c8489-m2gx8\" (UID: \"245ec869-f2a1-4769-8d64-d2bd9e0bca15\") " pod="openshift-console/console-59fc7c8489-m2gx8" Apr 17 16:35:16.678390 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:16.678356 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qkzd5\" (UniqueName: \"kubernetes.io/projected/245ec869-f2a1-4769-8d64-d2bd9e0bca15-kube-api-access-qkzd5\") pod \"console-59fc7c8489-m2gx8\" (UID: \"245ec869-f2a1-4769-8d64-d2bd9e0bca15\") " pod="openshift-console/console-59fc7c8489-m2gx8" Apr 17 16:35:16.678552 ip-10-0-131-177 kubenswrapper[2572]: 
I0417 16:35:16.678398 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/245ec869-f2a1-4769-8d64-d2bd9e0bca15-service-ca\") pod \"console-59fc7c8489-m2gx8\" (UID: \"245ec869-f2a1-4769-8d64-d2bd9e0bca15\") " pod="openshift-console/console-59fc7c8489-m2gx8" Apr 17 16:35:16.678552 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:16.678436 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/245ec869-f2a1-4769-8d64-d2bd9e0bca15-oauth-serving-cert\") pod \"console-59fc7c8489-m2gx8\" (UID: \"245ec869-f2a1-4769-8d64-d2bd9e0bca15\") " pod="openshift-console/console-59fc7c8489-m2gx8" Apr 17 16:35:16.678552 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:16.678457 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/245ec869-f2a1-4769-8d64-d2bd9e0bca15-console-config\") pod \"console-59fc7c8489-m2gx8\" (UID: \"245ec869-f2a1-4769-8d64-d2bd9e0bca15\") " pod="openshift-console/console-59fc7c8489-m2gx8" Apr 17 16:35:16.678552 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:16.678474 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/245ec869-f2a1-4769-8d64-d2bd9e0bca15-trusted-ca-bundle\") pod \"console-59fc7c8489-m2gx8\" (UID: \"245ec869-f2a1-4769-8d64-d2bd9e0bca15\") " pod="openshift-console/console-59fc7c8489-m2gx8" Apr 17 16:35:16.678552 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:16.678497 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/245ec869-f2a1-4769-8d64-d2bd9e0bca15-console-oauth-config\") pod \"console-59fc7c8489-m2gx8\" (UID: \"245ec869-f2a1-4769-8d64-d2bd9e0bca15\") " pod="openshift-console/console-59fc7c8489-m2gx8" Apr 
17 16:35:16.678825 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:16.678685 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/245ec869-f2a1-4769-8d64-d2bd9e0bca15-console-serving-cert\") pod \"console-59fc7c8489-m2gx8\" (UID: \"245ec869-f2a1-4769-8d64-d2bd9e0bca15\") " pod="openshift-console/console-59fc7c8489-m2gx8" Apr 17 16:35:16.679360 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:16.679266 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/245ec869-f2a1-4769-8d64-d2bd9e0bca15-console-config\") pod \"console-59fc7c8489-m2gx8\" (UID: \"245ec869-f2a1-4769-8d64-d2bd9e0bca15\") " pod="openshift-console/console-59fc7c8489-m2gx8" Apr 17 16:35:16.679360 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:16.679284 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/245ec869-f2a1-4769-8d64-d2bd9e0bca15-service-ca\") pod \"console-59fc7c8489-m2gx8\" (UID: \"245ec869-f2a1-4769-8d64-d2bd9e0bca15\") " pod="openshift-console/console-59fc7c8489-m2gx8" Apr 17 16:35:16.679360 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:16.679289 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/245ec869-f2a1-4769-8d64-d2bd9e0bca15-oauth-serving-cert\") pod \"console-59fc7c8489-m2gx8\" (UID: \"245ec869-f2a1-4769-8d64-d2bd9e0bca15\") " pod="openshift-console/console-59fc7c8489-m2gx8" Apr 17 16:35:16.679631 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:16.679608 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/245ec869-f2a1-4769-8d64-d2bd9e0bca15-trusted-ca-bundle\") pod \"console-59fc7c8489-m2gx8\" (UID: \"245ec869-f2a1-4769-8d64-d2bd9e0bca15\") " 
pod="openshift-console/console-59fc7c8489-m2gx8" Apr 17 16:35:16.681037 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:16.681020 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/245ec869-f2a1-4769-8d64-d2bd9e0bca15-console-oauth-config\") pod \"console-59fc7c8489-m2gx8\" (UID: \"245ec869-f2a1-4769-8d64-d2bd9e0bca15\") " pod="openshift-console/console-59fc7c8489-m2gx8" Apr 17 16:35:16.681286 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:16.681266 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/245ec869-f2a1-4769-8d64-d2bd9e0bca15-console-serving-cert\") pod \"console-59fc7c8489-m2gx8\" (UID: \"245ec869-f2a1-4769-8d64-d2bd9e0bca15\") " pod="openshift-console/console-59fc7c8489-m2gx8" Apr 17 16:35:16.686211 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:16.686157 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkzd5\" (UniqueName: \"kubernetes.io/projected/245ec869-f2a1-4769-8d64-d2bd9e0bca15-kube-api-access-qkzd5\") pod \"console-59fc7c8489-m2gx8\" (UID: \"245ec869-f2a1-4769-8d64-d2bd9e0bca15\") " pod="openshift-console/console-59fc7c8489-m2gx8" Apr 17 16:35:16.745880 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:16.745857 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-59fc7c8489-m2gx8" Apr 17 16:35:16.887336 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:16.887272 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-59fc7c8489-m2gx8"] Apr 17 16:35:16.889602 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:35:16.889570 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod245ec869_f2a1_4769_8d64_d2bd9e0bca15.slice/crio-82065e4ba8904ded28fa39fb685d9e0559fd14c428f13f7e81e5a13e7fad0bfa WatchSource:0}: Error finding container 82065e4ba8904ded28fa39fb685d9e0559fd14c428f13f7e81e5a13e7fad0bfa: Status 404 returned error can't find the container with id 82065e4ba8904ded28fa39fb685d9e0559fd14c428f13f7e81e5a13e7fad0bfa Apr 17 16:35:17.747660 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:17.747620 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-59fc7c8489-m2gx8" event={"ID":"245ec869-f2a1-4769-8d64-d2bd9e0bca15","Type":"ContainerStarted","Data":"3e0e371ec91f5ab72251847a1339397d93cfe58045b415cf895543a7263767f7"} Apr 17 16:35:17.747660 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:17.747663 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-59fc7c8489-m2gx8" event={"ID":"245ec869-f2a1-4769-8d64-d2bd9e0bca15","Type":"ContainerStarted","Data":"82065e4ba8904ded28fa39fb685d9e0559fd14c428f13f7e81e5a13e7fad0bfa"} Apr 17 16:35:17.764564 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:17.764527 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-59fc7c8489-m2gx8" podStartSLOduration=1.7645156069999999 podStartE2EDuration="1.764515607s" podCreationTimestamp="2026-04-17 16:35:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:35:17.764438113 +0000 UTC 
m=+240.352027084" watchObservedRunningTime="2026-04-17 16:35:17.764515607 +0000 UTC m=+240.352104577" Apr 17 16:35:26.746538 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:26.746500 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-59fc7c8489-m2gx8" Apr 17 16:35:26.746538 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:26.746545 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-59fc7c8489-m2gx8" Apr 17 16:35:26.751531 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:26.751511 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-59fc7c8489-m2gx8" Apr 17 16:35:26.776962 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:26.776941 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-59fc7c8489-m2gx8" Apr 17 16:35:26.825610 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:26.825580 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-69679fdccf-xcdxf"] Apr 17 16:35:26.941257 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:26.941218 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-8546bb5cc6-5hpzq"] Apr 17 16:35:26.946647 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:26.946623 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-8546bb5cc6-5hpzq" Apr 17 16:35:26.952693 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:26.952658 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-8546bb5cc6-5hpzq"] Apr 17 16:35:27.064777 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:27.064676 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f4bf6cc0-f391-4da0-8c81-f7bf071794e4-console-serving-cert\") pod \"console-8546bb5cc6-5hpzq\" (UID: \"f4bf6cc0-f391-4da0-8c81-f7bf071794e4\") " pod="openshift-console/console-8546bb5cc6-5hpzq" Apr 17 16:35:27.064777 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:27.064744 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f4bf6cc0-f391-4da0-8c81-f7bf071794e4-console-config\") pod \"console-8546bb5cc6-5hpzq\" (UID: \"f4bf6cc0-f391-4da0-8c81-f7bf071794e4\") " pod="openshift-console/console-8546bb5cc6-5hpzq" Apr 17 16:35:27.064963 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:27.064778 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f4bf6cc0-f391-4da0-8c81-f7bf071794e4-trusted-ca-bundle\") pod \"console-8546bb5cc6-5hpzq\" (UID: \"f4bf6cc0-f391-4da0-8c81-f7bf071794e4\") " pod="openshift-console/console-8546bb5cc6-5hpzq" Apr 17 16:35:27.064963 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:27.064803 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsrrs\" (UniqueName: \"kubernetes.io/projected/f4bf6cc0-f391-4da0-8c81-f7bf071794e4-kube-api-access-nsrrs\") pod \"console-8546bb5cc6-5hpzq\" (UID: \"f4bf6cc0-f391-4da0-8c81-f7bf071794e4\") " pod="openshift-console/console-8546bb5cc6-5hpzq" 
Apr 17 16:35:27.064963 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:27.064855 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f4bf6cc0-f391-4da0-8c81-f7bf071794e4-console-oauth-config\") pod \"console-8546bb5cc6-5hpzq\" (UID: \"f4bf6cc0-f391-4da0-8c81-f7bf071794e4\") " pod="openshift-console/console-8546bb5cc6-5hpzq" Apr 17 16:35:27.064963 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:27.064898 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f4bf6cc0-f391-4da0-8c81-f7bf071794e4-service-ca\") pod \"console-8546bb5cc6-5hpzq\" (UID: \"f4bf6cc0-f391-4da0-8c81-f7bf071794e4\") " pod="openshift-console/console-8546bb5cc6-5hpzq" Apr 17 16:35:27.064963 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:27.064932 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f4bf6cc0-f391-4da0-8c81-f7bf071794e4-oauth-serving-cert\") pod \"console-8546bb5cc6-5hpzq\" (UID: \"f4bf6cc0-f391-4da0-8c81-f7bf071794e4\") " pod="openshift-console/console-8546bb5cc6-5hpzq" Apr 17 16:35:27.165298 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:27.165270 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f4bf6cc0-f391-4da0-8c81-f7bf071794e4-console-config\") pod \"console-8546bb5cc6-5hpzq\" (UID: \"f4bf6cc0-f391-4da0-8c81-f7bf071794e4\") " pod="openshift-console/console-8546bb5cc6-5hpzq" Apr 17 16:35:27.165478 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:27.165304 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f4bf6cc0-f391-4da0-8c81-f7bf071794e4-trusted-ca-bundle\") pod 
\"console-8546bb5cc6-5hpzq\" (UID: \"f4bf6cc0-f391-4da0-8c81-f7bf071794e4\") " pod="openshift-console/console-8546bb5cc6-5hpzq" Apr 17 16:35:27.165478 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:27.165326 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nsrrs\" (UniqueName: \"kubernetes.io/projected/f4bf6cc0-f391-4da0-8c81-f7bf071794e4-kube-api-access-nsrrs\") pod \"console-8546bb5cc6-5hpzq\" (UID: \"f4bf6cc0-f391-4da0-8c81-f7bf071794e4\") " pod="openshift-console/console-8546bb5cc6-5hpzq" Apr 17 16:35:27.165478 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:27.165356 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f4bf6cc0-f391-4da0-8c81-f7bf071794e4-console-oauth-config\") pod \"console-8546bb5cc6-5hpzq\" (UID: \"f4bf6cc0-f391-4da0-8c81-f7bf071794e4\") " pod="openshift-console/console-8546bb5cc6-5hpzq" Apr 17 16:35:27.165657 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:27.165477 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f4bf6cc0-f391-4da0-8c81-f7bf071794e4-service-ca\") pod \"console-8546bb5cc6-5hpzq\" (UID: \"f4bf6cc0-f391-4da0-8c81-f7bf071794e4\") " pod="openshift-console/console-8546bb5cc6-5hpzq" Apr 17 16:35:27.165657 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:27.165531 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f4bf6cc0-f391-4da0-8c81-f7bf071794e4-oauth-serving-cert\") pod \"console-8546bb5cc6-5hpzq\" (UID: \"f4bf6cc0-f391-4da0-8c81-f7bf071794e4\") " pod="openshift-console/console-8546bb5cc6-5hpzq" Apr 17 16:35:27.165657 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:27.165591 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/f4bf6cc0-f391-4da0-8c81-f7bf071794e4-console-serving-cert\") pod \"console-8546bb5cc6-5hpzq\" (UID: \"f4bf6cc0-f391-4da0-8c81-f7bf071794e4\") " pod="openshift-console/console-8546bb5cc6-5hpzq" Apr 17 16:35:27.166112 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:27.166088 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f4bf6cc0-f391-4da0-8c81-f7bf071794e4-console-config\") pod \"console-8546bb5cc6-5hpzq\" (UID: \"f4bf6cc0-f391-4da0-8c81-f7bf071794e4\") " pod="openshift-console/console-8546bb5cc6-5hpzq" Apr 17 16:35:27.166332 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:27.166310 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f4bf6cc0-f391-4da0-8c81-f7bf071794e4-trusted-ca-bundle\") pod \"console-8546bb5cc6-5hpzq\" (UID: \"f4bf6cc0-f391-4da0-8c81-f7bf071794e4\") " pod="openshift-console/console-8546bb5cc6-5hpzq" Apr 17 16:35:27.166436 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:27.166310 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f4bf6cc0-f391-4da0-8c81-f7bf071794e4-oauth-serving-cert\") pod \"console-8546bb5cc6-5hpzq\" (UID: \"f4bf6cc0-f391-4da0-8c81-f7bf071794e4\") " pod="openshift-console/console-8546bb5cc6-5hpzq" Apr 17 16:35:27.166436 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:27.166378 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f4bf6cc0-f391-4da0-8c81-f7bf071794e4-service-ca\") pod \"console-8546bb5cc6-5hpzq\" (UID: \"f4bf6cc0-f391-4da0-8c81-f7bf071794e4\") " pod="openshift-console/console-8546bb5cc6-5hpzq" Apr 17 16:35:27.167964 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:27.167946 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f4bf6cc0-f391-4da0-8c81-f7bf071794e4-console-oauth-config\") pod \"console-8546bb5cc6-5hpzq\" (UID: \"f4bf6cc0-f391-4da0-8c81-f7bf071794e4\") " pod="openshift-console/console-8546bb5cc6-5hpzq" Apr 17 16:35:27.168098 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:27.168082 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f4bf6cc0-f391-4da0-8c81-f7bf071794e4-console-serving-cert\") pod \"console-8546bb5cc6-5hpzq\" (UID: \"f4bf6cc0-f391-4da0-8c81-f7bf071794e4\") " pod="openshift-console/console-8546bb5cc6-5hpzq" Apr 17 16:35:27.172594 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:27.172571 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nsrrs\" (UniqueName: \"kubernetes.io/projected/f4bf6cc0-f391-4da0-8c81-f7bf071794e4-kube-api-access-nsrrs\") pod \"console-8546bb5cc6-5hpzq\" (UID: \"f4bf6cc0-f391-4da0-8c81-f7bf071794e4\") " pod="openshift-console/console-8546bb5cc6-5hpzq" Apr 17 16:35:27.257935 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:27.257904 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-8546bb5cc6-5hpzq" Apr 17 16:35:27.380365 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:27.380342 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-8546bb5cc6-5hpzq"] Apr 17 16:35:27.382529 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:35:27.382499 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4bf6cc0_f391_4da0_8c81_f7bf071794e4.slice/crio-c896ab01f2188e2277072777f3e91afcc08f251be851112668766015aa9a4fc7 WatchSource:0}: Error finding container c896ab01f2188e2277072777f3e91afcc08f251be851112668766015aa9a4fc7: Status 404 returned error can't find the container with id c896ab01f2188e2277072777f3e91afcc08f251be851112668766015aa9a4fc7 Apr 17 16:35:27.777193 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:27.777160 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-8546bb5cc6-5hpzq" event={"ID":"f4bf6cc0-f391-4da0-8c81-f7bf071794e4","Type":"ContainerStarted","Data":"048872c40d607d005986941acc3891dc070df6e8c013956d588a154d79dec105"} Apr 17 16:35:27.777193 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:27.777196 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-8546bb5cc6-5hpzq" event={"ID":"f4bf6cc0-f391-4da0-8c81-f7bf071794e4","Type":"ContainerStarted","Data":"c896ab01f2188e2277072777f3e91afcc08f251be851112668766015aa9a4fc7"} Apr 17 16:35:27.797133 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:27.797090 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-8546bb5cc6-5hpzq" podStartSLOduration=1.797077801 podStartE2EDuration="1.797077801s" podCreationTimestamp="2026-04-17 16:35:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:35:27.795827949 +0000 UTC 
m=+250.383416920" watchObservedRunningTime="2026-04-17 16:35:27.797077801 +0000 UTC m=+250.384666771" Apr 17 16:35:37.258151 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:37.258061 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-8546bb5cc6-5hpzq" Apr 17 16:35:37.258151 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:37.258105 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-8546bb5cc6-5hpzq" Apr 17 16:35:37.262688 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:37.262664 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-8546bb5cc6-5hpzq" Apr 17 16:35:37.808754 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:37.808709 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-8546bb5cc6-5hpzq" Apr 17 16:35:37.898948 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:37.898921 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-59fc7c8489-m2gx8"] Apr 17 16:35:51.846940 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:51.846897 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-69679fdccf-xcdxf" podUID="133166db-7f9a-4de8-aca7-783d136f8480" containerName="console" containerID="cri-o://84d959831c2e0c4c42905fefa81ad8830e81eb963bece5e5ba68fa1a251d5ed9" gracePeriod=15 Apr 17 16:35:52.083039 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:52.083018 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-69679fdccf-xcdxf_133166db-7f9a-4de8-aca7-783d136f8480/console/0.log" Apr 17 16:35:52.083146 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:52.083085 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-69679fdccf-xcdxf" Apr 17 16:35:52.154811 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:52.154747 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/133166db-7f9a-4de8-aca7-783d136f8480-oauth-serving-cert\") pod \"133166db-7f9a-4de8-aca7-783d136f8480\" (UID: \"133166db-7f9a-4de8-aca7-783d136f8480\") " Apr 17 16:35:52.154811 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:52.154786 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/133166db-7f9a-4de8-aca7-783d136f8480-service-ca\") pod \"133166db-7f9a-4de8-aca7-783d136f8480\" (UID: \"133166db-7f9a-4de8-aca7-783d136f8480\") " Apr 17 16:35:52.154811 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:52.154813 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/133166db-7f9a-4de8-aca7-783d136f8480-console-oauth-config\") pod \"133166db-7f9a-4de8-aca7-783d136f8480\" (UID: \"133166db-7f9a-4de8-aca7-783d136f8480\") " Apr 17 16:35:52.155047 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:52.154929 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/133166db-7f9a-4de8-aca7-783d136f8480-trusted-ca-bundle\") pod \"133166db-7f9a-4de8-aca7-783d136f8480\" (UID: \"133166db-7f9a-4de8-aca7-783d136f8480\") " Apr 17 16:35:52.155047 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:52.155010 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/133166db-7f9a-4de8-aca7-783d136f8480-console-serving-cert\") pod \"133166db-7f9a-4de8-aca7-783d136f8480\" (UID: \"133166db-7f9a-4de8-aca7-783d136f8480\") " Apr 17 16:35:52.155154 
ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:52.155047 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/133166db-7f9a-4de8-aca7-783d136f8480-console-config\") pod \"133166db-7f9a-4de8-aca7-783d136f8480\" (UID: \"133166db-7f9a-4de8-aca7-783d136f8480\") " Apr 17 16:35:52.155154 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:52.155083 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lm92n\" (UniqueName: \"kubernetes.io/projected/133166db-7f9a-4de8-aca7-783d136f8480-kube-api-access-lm92n\") pod \"133166db-7f9a-4de8-aca7-783d136f8480\" (UID: \"133166db-7f9a-4de8-aca7-783d136f8480\") " Apr 17 16:35:52.155244 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:52.155200 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/133166db-7f9a-4de8-aca7-783d136f8480-service-ca" (OuterVolumeSpecName: "service-ca") pod "133166db-7f9a-4de8-aca7-783d136f8480" (UID: "133166db-7f9a-4de8-aca7-783d136f8480"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:35:52.155244 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:52.155207 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/133166db-7f9a-4de8-aca7-783d136f8480-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "133166db-7f9a-4de8-aca7-783d136f8480" (UID: "133166db-7f9a-4de8-aca7-783d136f8480"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:35:52.155352 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:52.155324 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/133166db-7f9a-4de8-aca7-783d136f8480-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "133166db-7f9a-4de8-aca7-783d136f8480" (UID: "133166db-7f9a-4de8-aca7-783d136f8480"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:35:52.155454 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:52.155421 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/133166db-7f9a-4de8-aca7-783d136f8480-console-config" (OuterVolumeSpecName: "console-config") pod "133166db-7f9a-4de8-aca7-783d136f8480" (UID: "133166db-7f9a-4de8-aca7-783d136f8480"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:35:52.155529 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:52.155462 2572 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/133166db-7f9a-4de8-aca7-783d136f8480-oauth-serving-cert\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\"" Apr 17 16:35:52.155529 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:52.155480 2572 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/133166db-7f9a-4de8-aca7-783d136f8480-service-ca\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\"" Apr 17 16:35:52.155529 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:52.155490 2572 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/133166db-7f9a-4de8-aca7-783d136f8480-trusted-ca-bundle\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\"" Apr 17 16:35:52.157058 ip-10-0-131-177 kubenswrapper[2572]: I0417 
16:35:52.157035 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/133166db-7f9a-4de8-aca7-783d136f8480-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "133166db-7f9a-4de8-aca7-783d136f8480" (UID: "133166db-7f9a-4de8-aca7-783d136f8480"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:35:52.157198 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:52.157178 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/133166db-7f9a-4de8-aca7-783d136f8480-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "133166db-7f9a-4de8-aca7-783d136f8480" (UID: "133166db-7f9a-4de8-aca7-783d136f8480"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:35:52.157263 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:52.157244 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/133166db-7f9a-4de8-aca7-783d136f8480-kube-api-access-lm92n" (OuterVolumeSpecName: "kube-api-access-lm92n") pod "133166db-7f9a-4de8-aca7-783d136f8480" (UID: "133166db-7f9a-4de8-aca7-783d136f8480"). InnerVolumeSpecName "kube-api-access-lm92n". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:35:52.255871 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:52.255845 2572 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/133166db-7f9a-4de8-aca7-783d136f8480-console-oauth-config\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\"" Apr 17 16:35:52.255871 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:52.255868 2572 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/133166db-7f9a-4de8-aca7-783d136f8480-console-serving-cert\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\"" Apr 17 16:35:52.256035 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:52.255880 2572 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/133166db-7f9a-4de8-aca7-783d136f8480-console-config\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\"" Apr 17 16:35:52.256035 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:52.255889 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lm92n\" (UniqueName: \"kubernetes.io/projected/133166db-7f9a-4de8-aca7-783d136f8480-kube-api-access-lm92n\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\"" Apr 17 16:35:52.851064 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:52.851031 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-69679fdccf-xcdxf_133166db-7f9a-4de8-aca7-783d136f8480/console/0.log" Apr 17 16:35:52.851502 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:52.851075 2572 generic.go:358] "Generic (PLEG): container finished" podID="133166db-7f9a-4de8-aca7-783d136f8480" containerID="84d959831c2e0c4c42905fefa81ad8830e81eb963bece5e5ba68fa1a251d5ed9" exitCode=2 Apr 17 16:35:52.851502 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:52.851130 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-console/console-69679fdccf-xcdxf" event={"ID":"133166db-7f9a-4de8-aca7-783d136f8480","Type":"ContainerDied","Data":"84d959831c2e0c4c42905fefa81ad8830e81eb963bece5e5ba68fa1a251d5ed9"} Apr 17 16:35:52.851502 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:52.851143 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-69679fdccf-xcdxf" Apr 17 16:35:52.851502 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:52.851162 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-69679fdccf-xcdxf" event={"ID":"133166db-7f9a-4de8-aca7-783d136f8480","Type":"ContainerDied","Data":"d24491593e23cb660c1458e027972fec774999cde3314f941c3af4f566923731"} Apr 17 16:35:52.851502 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:52.851183 2572 scope.go:117] "RemoveContainer" containerID="84d959831c2e0c4c42905fefa81ad8830e81eb963bece5e5ba68fa1a251d5ed9" Apr 17 16:35:52.859855 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:52.859835 2572 scope.go:117] "RemoveContainer" containerID="84d959831c2e0c4c42905fefa81ad8830e81eb963bece5e5ba68fa1a251d5ed9" Apr 17 16:35:52.860108 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:35:52.860064 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84d959831c2e0c4c42905fefa81ad8830e81eb963bece5e5ba68fa1a251d5ed9\": container with ID starting with 84d959831c2e0c4c42905fefa81ad8830e81eb963bece5e5ba68fa1a251d5ed9 not found: ID does not exist" containerID="84d959831c2e0c4c42905fefa81ad8830e81eb963bece5e5ba68fa1a251d5ed9" Apr 17 16:35:52.860159 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:52.860117 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84d959831c2e0c4c42905fefa81ad8830e81eb963bece5e5ba68fa1a251d5ed9"} err="failed to get container status \"84d959831c2e0c4c42905fefa81ad8830e81eb963bece5e5ba68fa1a251d5ed9\": rpc error: code = 
NotFound desc = could not find container \"84d959831c2e0c4c42905fefa81ad8830e81eb963bece5e5ba68fa1a251d5ed9\": container with ID starting with 84d959831c2e0c4c42905fefa81ad8830e81eb963bece5e5ba68fa1a251d5ed9 not found: ID does not exist" Apr 17 16:35:52.871896 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:52.871876 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-69679fdccf-xcdxf"] Apr 17 16:35:52.877608 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:52.877584 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-69679fdccf-xcdxf"] Apr 17 16:35:53.973133 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:35:53.973094 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="133166db-7f9a-4de8-aca7-783d136f8480" path="/var/lib/kubelet/pods/133166db-7f9a-4de8-aca7-783d136f8480/volumes" Apr 17 16:36:02.918441 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:02.918382 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-59fc7c8489-m2gx8" podUID="245ec869-f2a1-4769-8d64-d2bd9e0bca15" containerName="console" containerID="cri-o://3e0e371ec91f5ab72251847a1339397d93cfe58045b415cf895543a7263767f7" gracePeriod=15 Apr 17 16:36:03.167326 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:03.167300 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-59fc7c8489-m2gx8_245ec869-f2a1-4769-8d64-d2bd9e0bca15/console/0.log" Apr 17 16:36:03.167435 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:03.167360 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-59fc7c8489-m2gx8" Apr 17 16:36:03.240906 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:03.240838 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/245ec869-f2a1-4769-8d64-d2bd9e0bca15-oauth-serving-cert\") pod \"245ec869-f2a1-4769-8d64-d2bd9e0bca15\" (UID: \"245ec869-f2a1-4769-8d64-d2bd9e0bca15\") " Apr 17 16:36:03.240906 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:03.240885 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/245ec869-f2a1-4769-8d64-d2bd9e0bca15-service-ca\") pod \"245ec869-f2a1-4769-8d64-d2bd9e0bca15\" (UID: \"245ec869-f2a1-4769-8d64-d2bd9e0bca15\") " Apr 17 16:36:03.240906 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:03.240908 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/245ec869-f2a1-4769-8d64-d2bd9e0bca15-console-config\") pod \"245ec869-f2a1-4769-8d64-d2bd9e0bca15\" (UID: \"245ec869-f2a1-4769-8d64-d2bd9e0bca15\") " Apr 17 16:36:03.241158 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:03.241000 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qkzd5\" (UniqueName: \"kubernetes.io/projected/245ec869-f2a1-4769-8d64-d2bd9e0bca15-kube-api-access-qkzd5\") pod \"245ec869-f2a1-4769-8d64-d2bd9e0bca15\" (UID: \"245ec869-f2a1-4769-8d64-d2bd9e0bca15\") " Apr 17 16:36:03.241158 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:03.241055 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/245ec869-f2a1-4769-8d64-d2bd9e0bca15-console-oauth-config\") pod \"245ec869-f2a1-4769-8d64-d2bd9e0bca15\" (UID: \"245ec869-f2a1-4769-8d64-d2bd9e0bca15\") " Apr 17 16:36:03.241158 
ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:03.241094 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/245ec869-f2a1-4769-8d64-d2bd9e0bca15-console-serving-cert\") pod \"245ec869-f2a1-4769-8d64-d2bd9e0bca15\" (UID: \"245ec869-f2a1-4769-8d64-d2bd9e0bca15\") " Apr 17 16:36:03.241318 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:03.241166 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/245ec869-f2a1-4769-8d64-d2bd9e0bca15-trusted-ca-bundle\") pod \"245ec869-f2a1-4769-8d64-d2bd9e0bca15\" (UID: \"245ec869-f2a1-4769-8d64-d2bd9e0bca15\") " Apr 17 16:36:03.241318 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:03.241232 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/245ec869-f2a1-4769-8d64-d2bd9e0bca15-console-config" (OuterVolumeSpecName: "console-config") pod "245ec869-f2a1-4769-8d64-d2bd9e0bca15" (UID: "245ec869-f2a1-4769-8d64-d2bd9e0bca15"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:36:03.241318 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:03.241257 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/245ec869-f2a1-4769-8d64-d2bd9e0bca15-service-ca" (OuterVolumeSpecName: "service-ca") pod "245ec869-f2a1-4769-8d64-d2bd9e0bca15" (UID: "245ec869-f2a1-4769-8d64-d2bd9e0bca15"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:36:03.241318 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:03.241264 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/245ec869-f2a1-4769-8d64-d2bd9e0bca15-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "245ec869-f2a1-4769-8d64-d2bd9e0bca15" (UID: "245ec869-f2a1-4769-8d64-d2bd9e0bca15"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:36:03.241519 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:03.241425 2572 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/245ec869-f2a1-4769-8d64-d2bd9e0bca15-oauth-serving-cert\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\"" Apr 17 16:36:03.241519 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:03.241444 2572 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/245ec869-f2a1-4769-8d64-d2bd9e0bca15-service-ca\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\"" Apr 17 16:36:03.241519 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:03.241457 2572 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/245ec869-f2a1-4769-8d64-d2bd9e0bca15-console-config\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\"" Apr 17 16:36:03.241670 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:03.241564 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/245ec869-f2a1-4769-8d64-d2bd9e0bca15-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "245ec869-f2a1-4769-8d64-d2bd9e0bca15" (UID: "245ec869-f2a1-4769-8d64-d2bd9e0bca15"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:36:03.243377 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:03.243352 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/245ec869-f2a1-4769-8d64-d2bd9e0bca15-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "245ec869-f2a1-4769-8d64-d2bd9e0bca15" (UID: "245ec869-f2a1-4769-8d64-d2bd9e0bca15"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:36:03.243643 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:03.243622 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/245ec869-f2a1-4769-8d64-d2bd9e0bca15-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "245ec869-f2a1-4769-8d64-d2bd9e0bca15" (UID: "245ec869-f2a1-4769-8d64-d2bd9e0bca15"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:36:03.243738 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:03.243640 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/245ec869-f2a1-4769-8d64-d2bd9e0bca15-kube-api-access-qkzd5" (OuterVolumeSpecName: "kube-api-access-qkzd5") pod "245ec869-f2a1-4769-8d64-d2bd9e0bca15" (UID: "245ec869-f2a1-4769-8d64-d2bd9e0bca15"). InnerVolumeSpecName "kube-api-access-qkzd5". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:36:03.341931 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:03.341905 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qkzd5\" (UniqueName: \"kubernetes.io/projected/245ec869-f2a1-4769-8d64-d2bd9e0bca15-kube-api-access-qkzd5\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\"" Apr 17 16:36:03.341931 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:03.341928 2572 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/245ec869-f2a1-4769-8d64-d2bd9e0bca15-console-oauth-config\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\"" Apr 17 16:36:03.342069 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:03.341937 2572 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/245ec869-f2a1-4769-8d64-d2bd9e0bca15-console-serving-cert\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\"" Apr 17 16:36:03.342069 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:03.341945 2572 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/245ec869-f2a1-4769-8d64-d2bd9e0bca15-trusted-ca-bundle\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\"" Apr 17 16:36:03.890250 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:03.890222 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-59fc7c8489-m2gx8_245ec869-f2a1-4769-8d64-d2bd9e0bca15/console/0.log" Apr 17 16:36:03.890412 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:03.890260 2572 generic.go:358] "Generic (PLEG): container finished" podID="245ec869-f2a1-4769-8d64-d2bd9e0bca15" containerID="3e0e371ec91f5ab72251847a1339397d93cfe58045b415cf895543a7263767f7" exitCode=2 Apr 17 16:36:03.890412 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:03.890295 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-console/console-59fc7c8489-m2gx8" event={"ID":"245ec869-f2a1-4769-8d64-d2bd9e0bca15","Type":"ContainerDied","Data":"3e0e371ec91f5ab72251847a1339397d93cfe58045b415cf895543a7263767f7"} Apr 17 16:36:03.890412 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:03.890324 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-59fc7c8489-m2gx8" Apr 17 16:36:03.890412 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:03.890336 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-59fc7c8489-m2gx8" event={"ID":"245ec869-f2a1-4769-8d64-d2bd9e0bca15","Type":"ContainerDied","Data":"82065e4ba8904ded28fa39fb685d9e0559fd14c428f13f7e81e5a13e7fad0bfa"} Apr 17 16:36:03.890412 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:03.890353 2572 scope.go:117] "RemoveContainer" containerID="3e0e371ec91f5ab72251847a1339397d93cfe58045b415cf895543a7263767f7" Apr 17 16:36:03.898965 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:03.898948 2572 scope.go:117] "RemoveContainer" containerID="3e0e371ec91f5ab72251847a1339397d93cfe58045b415cf895543a7263767f7" Apr 17 16:36:03.899206 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:36:03.899189 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e0e371ec91f5ab72251847a1339397d93cfe58045b415cf895543a7263767f7\": container with ID starting with 3e0e371ec91f5ab72251847a1339397d93cfe58045b415cf895543a7263767f7 not found: ID does not exist" containerID="3e0e371ec91f5ab72251847a1339397d93cfe58045b415cf895543a7263767f7" Apr 17 16:36:03.899256 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:03.899215 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e0e371ec91f5ab72251847a1339397d93cfe58045b415cf895543a7263767f7"} err="failed to get container status \"3e0e371ec91f5ab72251847a1339397d93cfe58045b415cf895543a7263767f7\": rpc error: code = 
NotFound desc = could not find container \"3e0e371ec91f5ab72251847a1339397d93cfe58045b415cf895543a7263767f7\": container with ID starting with 3e0e371ec91f5ab72251847a1339397d93cfe58045b415cf895543a7263767f7 not found: ID does not exist" Apr 17 16:36:03.909759 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:03.909730 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-59fc7c8489-m2gx8"] Apr 17 16:36:03.912284 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:03.912261 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-59fc7c8489-m2gx8"] Apr 17 16:36:03.973425 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:03.973398 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="245ec869-f2a1-4769-8d64-d2bd9e0bca15" path="/var/lib/kubelet/pods/245ec869-f2a1-4769-8d64-d2bd9e0bca15/volumes" Apr 17 16:36:11.263364 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:11.263327 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cv84lr"] Apr 17 16:36:11.263739 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:11.263615 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="245ec869-f2a1-4769-8d64-d2bd9e0bca15" containerName="console" Apr 17 16:36:11.263739 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:11.263626 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="245ec869-f2a1-4769-8d64-d2bd9e0bca15" containerName="console" Apr 17 16:36:11.263739 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:11.263635 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="133166db-7f9a-4de8-aca7-783d136f8480" containerName="console" Apr 17 16:36:11.263739 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:11.263640 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="133166db-7f9a-4de8-aca7-783d136f8480" containerName="console" Apr 17 16:36:11.263739 
ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:11.263695 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="133166db-7f9a-4de8-aca7-783d136f8480" containerName="console" Apr 17 16:36:11.263739 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:11.263703 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="245ec869-f2a1-4769-8d64-d2bd9e0bca15" containerName="console" Apr 17 16:36:11.266132 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:11.266117 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cv84lr" Apr 17 16:36:11.268592 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:11.268568 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 17 16:36:11.268735 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:11.268568 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 17 16:36:11.268735 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:11.268630 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-q9kms\"" Apr 17 16:36:11.274509 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:11.274484 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cv84lr"] Apr 17 16:36:11.306228 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:11.306205 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-652ff\" (UniqueName: \"kubernetes.io/projected/d006f1b9-6e84-4812-83a6-48d0cb0c8a00-kube-api-access-652ff\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cv84lr\" (UID: \"d006f1b9-6e84-4812-83a6-48d0cb0c8a00\") " 
pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cv84lr" Apr 17 16:36:11.306339 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:11.306243 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d006f1b9-6e84-4812-83a6-48d0cb0c8a00-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cv84lr\" (UID: \"d006f1b9-6e84-4812-83a6-48d0cb0c8a00\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cv84lr" Apr 17 16:36:11.306339 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:11.306311 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d006f1b9-6e84-4812-83a6-48d0cb0c8a00-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cv84lr\" (UID: \"d006f1b9-6e84-4812-83a6-48d0cb0c8a00\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cv84lr" Apr 17 16:36:11.407324 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:11.407289 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-652ff\" (UniqueName: \"kubernetes.io/projected/d006f1b9-6e84-4812-83a6-48d0cb0c8a00-kube-api-access-652ff\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cv84lr\" (UID: \"d006f1b9-6e84-4812-83a6-48d0cb0c8a00\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cv84lr" Apr 17 16:36:11.407465 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:11.407331 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d006f1b9-6e84-4812-83a6-48d0cb0c8a00-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cv84lr\" (UID: \"d006f1b9-6e84-4812-83a6-48d0cb0c8a00\") " 
pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cv84lr" Apr 17 16:36:11.407465 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:11.407375 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d006f1b9-6e84-4812-83a6-48d0cb0c8a00-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cv84lr\" (UID: \"d006f1b9-6e84-4812-83a6-48d0cb0c8a00\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cv84lr" Apr 17 16:36:11.407706 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:11.407691 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d006f1b9-6e84-4812-83a6-48d0cb0c8a00-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cv84lr\" (UID: \"d006f1b9-6e84-4812-83a6-48d0cb0c8a00\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cv84lr" Apr 17 16:36:11.407780 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:11.407745 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d006f1b9-6e84-4812-83a6-48d0cb0c8a00-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cv84lr\" (UID: \"d006f1b9-6e84-4812-83a6-48d0cb0c8a00\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cv84lr" Apr 17 16:36:11.416503 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:11.416480 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-652ff\" (UniqueName: \"kubernetes.io/projected/d006f1b9-6e84-4812-83a6-48d0cb0c8a00-kube-api-access-652ff\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cv84lr\" (UID: \"d006f1b9-6e84-4812-83a6-48d0cb0c8a00\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cv84lr" Apr 17 
16:36:11.575957 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:11.575877 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cv84lr" Apr 17 16:36:11.695828 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:11.695768 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cv84lr"] Apr 17 16:36:11.697843 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:36:11.697818 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd006f1b9_6e84_4812_83a6_48d0cb0c8a00.slice/crio-09fca8d06f4aee1a1636f32df8f261c52277370b0774c06e99c69fa40235f8c7 WatchSource:0}: Error finding container 09fca8d06f4aee1a1636f32df8f261c52277370b0774c06e99c69fa40235f8c7: Status 404 returned error can't find the container with id 09fca8d06f4aee1a1636f32df8f261c52277370b0774c06e99c69fa40235f8c7 Apr 17 16:36:11.915597 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:11.915564 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cv84lr" event={"ID":"d006f1b9-6e84-4812-83a6-48d0cb0c8a00","Type":"ContainerStarted","Data":"09fca8d06f4aee1a1636f32df8f261c52277370b0774c06e99c69fa40235f8c7"} Apr 17 16:36:17.836502 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:17.836417 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-qghlf_4135c5b6-7f8a-4eaf-b551-405c8ab00981/console-operator/1.log" Apr 17 16:36:17.836502 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:17.836417 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-qghlf_4135c5b6-7f8a-4eaf-b551-405c8ab00981/console-operator/1.log" Apr 17 16:36:17.841997 ip-10-0-131-177 
kubenswrapper[2572]: I0417 16:36:17.841978 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vj4bz_f0075f81-88ff-4518-bd4e-bc50656a593b/ovn-acl-logging/0.log" Apr 17 16:36:17.842056 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:17.842018 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vj4bz_f0075f81-88ff-4518-bd4e-bc50656a593b/ovn-acl-logging/0.log" Apr 17 16:36:17.934357 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:17.934203 2572 generic.go:358] "Generic (PLEG): container finished" podID="d006f1b9-6e84-4812-83a6-48d0cb0c8a00" containerID="3c52c8b9fb5fc3670b0b8202ab5c90271f2196971f313bffede77543cf6d3a4b" exitCode=0 Apr 17 16:36:17.934357 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:17.934237 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cv84lr" event={"ID":"d006f1b9-6e84-4812-83a6-48d0cb0c8a00","Type":"ContainerDied","Data":"3c52c8b9fb5fc3670b0b8202ab5c90271f2196971f313bffede77543cf6d3a4b"} Apr 17 16:36:20.948624 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:20.948592 2572 generic.go:358] "Generic (PLEG): container finished" podID="d006f1b9-6e84-4812-83a6-48d0cb0c8a00" containerID="4871c1050148d751a32ea702ace14b88fb964b1e544a847773604b43b3359a3a" exitCode=0 Apr 17 16:36:20.949019 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:20.948641 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cv84lr" event={"ID":"d006f1b9-6e84-4812-83a6-48d0cb0c8a00","Type":"ContainerDied","Data":"4871c1050148d751a32ea702ace14b88fb964b1e544a847773604b43b3359a3a"} Apr 17 16:36:20.949583 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:20.949564 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 16:36:29.980040 ip-10-0-131-177 
kubenswrapper[2572]: I0417 16:36:29.980008 2572 generic.go:358] "Generic (PLEG): container finished" podID="d006f1b9-6e84-4812-83a6-48d0cb0c8a00" containerID="c290b7bf3dd401fef533a4319d343b0b6799feca671a4115d3ea83ee158f904f" exitCode=0 Apr 17 16:36:29.980434 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:29.980059 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cv84lr" event={"ID":"d006f1b9-6e84-4812-83a6-48d0cb0c8a00","Type":"ContainerDied","Data":"c290b7bf3dd401fef533a4319d343b0b6799feca671a4115d3ea83ee158f904f"} Apr 17 16:36:31.105226 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:31.105205 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cv84lr" Apr 17 16:36:31.177926 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:31.177896 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-652ff\" (UniqueName: \"kubernetes.io/projected/d006f1b9-6e84-4812-83a6-48d0cb0c8a00-kube-api-access-652ff\") pod \"d006f1b9-6e84-4812-83a6-48d0cb0c8a00\" (UID: \"d006f1b9-6e84-4812-83a6-48d0cb0c8a00\") " Apr 17 16:36:31.178042 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:31.177970 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d006f1b9-6e84-4812-83a6-48d0cb0c8a00-util\") pod \"d006f1b9-6e84-4812-83a6-48d0cb0c8a00\" (UID: \"d006f1b9-6e84-4812-83a6-48d0cb0c8a00\") " Apr 17 16:36:31.178042 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:31.178009 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d006f1b9-6e84-4812-83a6-48d0cb0c8a00-bundle\") pod \"d006f1b9-6e84-4812-83a6-48d0cb0c8a00\" (UID: \"d006f1b9-6e84-4812-83a6-48d0cb0c8a00\") " Apr 17 16:36:31.178613 
ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:31.178542 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d006f1b9-6e84-4812-83a6-48d0cb0c8a00-bundle" (OuterVolumeSpecName: "bundle") pod "d006f1b9-6e84-4812-83a6-48d0cb0c8a00" (UID: "d006f1b9-6e84-4812-83a6-48d0cb0c8a00"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:36:31.180141 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:31.180113 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d006f1b9-6e84-4812-83a6-48d0cb0c8a00-kube-api-access-652ff" (OuterVolumeSpecName: "kube-api-access-652ff") pod "d006f1b9-6e84-4812-83a6-48d0cb0c8a00" (UID: "d006f1b9-6e84-4812-83a6-48d0cb0c8a00"). InnerVolumeSpecName "kube-api-access-652ff". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:36:31.182011 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:31.181990 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d006f1b9-6e84-4812-83a6-48d0cb0c8a00-util" (OuterVolumeSpecName: "util") pod "d006f1b9-6e84-4812-83a6-48d0cb0c8a00" (UID: "d006f1b9-6e84-4812-83a6-48d0cb0c8a00"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:36:31.279358 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:31.279287 2572 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d006f1b9-6e84-4812-83a6-48d0cb0c8a00-bundle\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\"" Apr 17 16:36:31.279358 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:31.279313 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-652ff\" (UniqueName: \"kubernetes.io/projected/d006f1b9-6e84-4812-83a6-48d0cb0c8a00-kube-api-access-652ff\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\"" Apr 17 16:36:31.279358 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:31.279324 2572 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d006f1b9-6e84-4812-83a6-48d0cb0c8a00-util\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\"" Apr 17 16:36:31.987881 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:31.987850 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cv84lr"
Apr 17 16:36:31.988013 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:31.987844 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cv84lr" event={"ID":"d006f1b9-6e84-4812-83a6-48d0cb0c8a00","Type":"ContainerDied","Data":"09fca8d06f4aee1a1636f32df8f261c52277370b0774c06e99c69fa40235f8c7"}
Apr 17 16:36:31.988013 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:31.987959 2572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="09fca8d06f4aee1a1636f32df8f261c52277370b0774c06e99c69fa40235f8c7"
Apr 17 16:36:38.469598 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:38.469561 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-7lr2k"]
Apr 17 16:36:38.470176 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:38.470022 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d006f1b9-6e84-4812-83a6-48d0cb0c8a00" containerName="util"
Apr 17 16:36:38.470176 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:38.470041 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="d006f1b9-6e84-4812-83a6-48d0cb0c8a00" containerName="util"
Apr 17 16:36:38.470176 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:38.470061 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d006f1b9-6e84-4812-83a6-48d0cb0c8a00" containerName="extract"
Apr 17 16:36:38.470176 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:38.470069 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="d006f1b9-6e84-4812-83a6-48d0cb0c8a00" containerName="extract"
Apr 17 16:36:38.470176 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:38.470086 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d006f1b9-6e84-4812-83a6-48d0cb0c8a00" containerName="pull"
Apr 17 16:36:38.470176 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:38.470094 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="d006f1b9-6e84-4812-83a6-48d0cb0c8a00" containerName="pull"
Apr 17 16:36:38.470176 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:38.470169 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="d006f1b9-6e84-4812-83a6-48d0cb0c8a00" containerName="extract"
Apr 17 16:36:38.472985 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:38.472965 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-7lr2k"
Apr 17 16:36:38.475159 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:38.475142 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\""
Apr 17 16:36:38.475300 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:38.475282 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\""
Apr 17 16:36:38.475343 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:38.475284 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-rc4wq\""
Apr 17 16:36:38.475409 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:38.475290 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\""
Apr 17 16:36:38.484771 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:38.484753 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-7lr2k"]
Apr 17 16:36:38.635995 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:38.635963 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgfqv\" (UniqueName: \"kubernetes.io/projected/4638b180-9946-4931-8025-e2a7a2e1b594-kube-api-access-rgfqv\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-7lr2k\" (UID: \"4638b180-9946-4931-8025-e2a7a2e1b594\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-7lr2k"
Apr 17 16:36:38.636187 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:38.636035 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/4638b180-9946-4931-8025-e2a7a2e1b594-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-7lr2k\" (UID: \"4638b180-9946-4931-8025-e2a7a2e1b594\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-7lr2k"
Apr 17 16:36:38.736870 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:38.736795 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rgfqv\" (UniqueName: \"kubernetes.io/projected/4638b180-9946-4931-8025-e2a7a2e1b594-kube-api-access-rgfqv\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-7lr2k\" (UID: \"4638b180-9946-4931-8025-e2a7a2e1b594\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-7lr2k"
Apr 17 16:36:38.737007 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:38.736877 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/4638b180-9946-4931-8025-e2a7a2e1b594-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-7lr2k\" (UID: \"4638b180-9946-4931-8025-e2a7a2e1b594\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-7lr2k"
Apr 17 16:36:38.739355 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:38.739333 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/4638b180-9946-4931-8025-e2a7a2e1b594-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-7lr2k\" (UID: \"4638b180-9946-4931-8025-e2a7a2e1b594\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-7lr2k"
Apr 17 16:36:38.745526 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:38.745499 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgfqv\" (UniqueName: \"kubernetes.io/projected/4638b180-9946-4931-8025-e2a7a2e1b594-kube-api-access-rgfqv\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-7lr2k\" (UID: \"4638b180-9946-4931-8025-e2a7a2e1b594\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-7lr2k"
Apr 17 16:36:38.782395 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:38.782359 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-7lr2k"
Apr 17 16:36:38.904462 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:38.904433 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-7lr2k"]
Apr 17 16:36:38.907992 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:36:38.907962 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4638b180_9946_4931_8025_e2a7a2e1b594.slice/crio-bb12aa71c9158c8f01846819d8713b475d9892f59e60be2ccece96fa3699213e WatchSource:0}: Error finding container bb12aa71c9158c8f01846819d8713b475d9892f59e60be2ccece96fa3699213e: Status 404 returned error can't find the container with id bb12aa71c9158c8f01846819d8713b475d9892f59e60be2ccece96fa3699213e
Apr 17 16:36:39.009106 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:39.009032 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-7lr2k" event={"ID":"4638b180-9946-4931-8025-e2a7a2e1b594","Type":"ContainerStarted","Data":"bb12aa71c9158c8f01846819d8713b475d9892f59e60be2ccece96fa3699213e"}
Apr 17 16:36:42.534359 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:42.534325 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-b69gd"]
Apr 17 16:36:42.537604 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:42.537585 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-b69gd"
Apr 17 16:36:42.539987 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:42.539965 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\""
Apr 17 16:36:42.540096 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:42.539983 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-p2f7z\""
Apr 17 16:36:42.540465 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:42.540449 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\""
Apr 17 16:36:42.548359 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:42.548337 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-b69gd"]
Apr 17 16:36:42.672560 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:42.672522 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/0070cb8f-71eb-4133-b001-dfc699ae3d5b-certificates\") pod \"keda-operator-ffbb595cb-b69gd\" (UID: \"0070cb8f-71eb-4133-b001-dfc699ae3d5b\") " pod="openshift-keda/keda-operator-ffbb595cb-b69gd"
Apr 17 16:36:42.672560 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:42.672557 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85r49\" (UniqueName: \"kubernetes.io/projected/0070cb8f-71eb-4133-b001-dfc699ae3d5b-kube-api-access-85r49\") pod \"keda-operator-ffbb595cb-b69gd\" (UID: \"0070cb8f-71eb-4133-b001-dfc699ae3d5b\") " pod="openshift-keda/keda-operator-ffbb595cb-b69gd"
Apr 17 16:36:42.672793 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:42.672584 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/0070cb8f-71eb-4133-b001-dfc699ae3d5b-cabundle0\") pod \"keda-operator-ffbb595cb-b69gd\" (UID: \"0070cb8f-71eb-4133-b001-dfc699ae3d5b\") " pod="openshift-keda/keda-operator-ffbb595cb-b69gd"
Apr 17 16:36:42.773834 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:42.773807 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/0070cb8f-71eb-4133-b001-dfc699ae3d5b-cabundle0\") pod \"keda-operator-ffbb595cb-b69gd\" (UID: \"0070cb8f-71eb-4133-b001-dfc699ae3d5b\") " pod="openshift-keda/keda-operator-ffbb595cb-b69gd"
Apr 17 16:36:42.773976 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:42.773880 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/0070cb8f-71eb-4133-b001-dfc699ae3d5b-certificates\") pod \"keda-operator-ffbb595cb-b69gd\" (UID: \"0070cb8f-71eb-4133-b001-dfc699ae3d5b\") " pod="openshift-keda/keda-operator-ffbb595cb-b69gd"
Apr 17 16:36:42.773976 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:42.773900 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-85r49\" (UniqueName: \"kubernetes.io/projected/0070cb8f-71eb-4133-b001-dfc699ae3d5b-kube-api-access-85r49\") pod \"keda-operator-ffbb595cb-b69gd\" (UID: \"0070cb8f-71eb-4133-b001-dfc699ae3d5b\") " pod="openshift-keda/keda-operator-ffbb595cb-b69gd"
Apr 17 16:36:42.774055 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:36:42.774014 2572 secret.go:281] references non-existent secret key: ca.crt
Apr 17 16:36:42.774055 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:36:42.774032 2572 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 17 16:36:42.774055 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:36:42.774043 2572 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-b69gd: references non-existent secret key: ca.crt
Apr 17 16:36:42.774177 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:36:42.774097 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0070cb8f-71eb-4133-b001-dfc699ae3d5b-certificates podName:0070cb8f-71eb-4133-b001-dfc699ae3d5b nodeName:}" failed. No retries permitted until 2026-04-17 16:36:43.274082187 +0000 UTC m=+325.861671135 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/0070cb8f-71eb-4133-b001-dfc699ae3d5b-certificates") pod "keda-operator-ffbb595cb-b69gd" (UID: "0070cb8f-71eb-4133-b001-dfc699ae3d5b") : references non-existent secret key: ca.crt
Apr 17 16:36:42.774420 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:42.774402 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/0070cb8f-71eb-4133-b001-dfc699ae3d5b-cabundle0\") pod \"keda-operator-ffbb595cb-b69gd\" (UID: \"0070cb8f-71eb-4133-b001-dfc699ae3d5b\") " pod="openshift-keda/keda-operator-ffbb595cb-b69gd"
Apr 17 16:36:42.782391 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:42.782362 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-85r49\" (UniqueName: \"kubernetes.io/projected/0070cb8f-71eb-4133-b001-dfc699ae3d5b-kube-api-access-85r49\") pod \"keda-operator-ffbb595cb-b69gd\" (UID: \"0070cb8f-71eb-4133-b001-dfc699ae3d5b\") " pod="openshift-keda/keda-operator-ffbb595cb-b69gd"
Apr 17 16:36:43.025034 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:43.024993 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-7lr2k" event={"ID":"4638b180-9946-4931-8025-e2a7a2e1b594","Type":"ContainerStarted","Data":"0159f9798a3cee716adb7bc835ac151825233a97005b6423e186ecc3bb681cbb"}
Apr 17 16:36:43.025226 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:43.025134 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-7lr2k"
Apr 17 16:36:43.045942 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:43.045896 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-7lr2k" podStartSLOduration=1.958738682 podStartE2EDuration="5.045881991s" podCreationTimestamp="2026-04-17 16:36:38 +0000 UTC" firstStartedPulling="2026-04-17 16:36:38.910089385 +0000 UTC m=+321.497678340" lastFinishedPulling="2026-04-17 16:36:41.997232698 +0000 UTC m=+324.584821649" observedRunningTime="2026-04-17 16:36:43.044566579 +0000 UTC m=+325.632155558" watchObservedRunningTime="2026-04-17 16:36:43.045881991 +0000 UTC m=+325.633470960"
Apr 17 16:36:43.278377 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:43.278283 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/0070cb8f-71eb-4133-b001-dfc699ae3d5b-certificates\") pod \"keda-operator-ffbb595cb-b69gd\" (UID: \"0070cb8f-71eb-4133-b001-dfc699ae3d5b\") " pod="openshift-keda/keda-operator-ffbb595cb-b69gd"
Apr 17 16:36:43.278538 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:36:43.278440 2572 secret.go:281] references non-existent secret key: ca.crt
Apr 17 16:36:43.278538 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:36:43.278465 2572 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 17 16:36:43.278538 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:36:43.278477 2572 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-b69gd: references non-existent secret key: ca.crt
Apr 17 16:36:43.278538 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:36:43.278538 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0070cb8f-71eb-4133-b001-dfc699ae3d5b-certificates podName:0070cb8f-71eb-4133-b001-dfc699ae3d5b nodeName:}" failed. No retries permitted until 2026-04-17 16:36:44.27852035 +0000 UTC m=+326.866109303 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/0070cb8f-71eb-4133-b001-dfc699ae3d5b-certificates") pod "keda-operator-ffbb595cb-b69gd" (UID: "0070cb8f-71eb-4133-b001-dfc699ae3d5b") : references non-existent secret key: ca.crt
Apr 17 16:36:44.286215 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:44.286178 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/0070cb8f-71eb-4133-b001-dfc699ae3d5b-certificates\") pod \"keda-operator-ffbb595cb-b69gd\" (UID: \"0070cb8f-71eb-4133-b001-dfc699ae3d5b\") " pod="openshift-keda/keda-operator-ffbb595cb-b69gd"
Apr 17 16:36:44.286547 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:36:44.286318 2572 secret.go:281] references non-existent secret key: ca.crt
Apr 17 16:36:44.286547 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:36:44.286336 2572 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 17 16:36:44.286547 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:36:44.286346 2572 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-b69gd: references non-existent secret key: ca.crt
Apr 17 16:36:44.286547 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:36:44.286410 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0070cb8f-71eb-4133-b001-dfc699ae3d5b-certificates podName:0070cb8f-71eb-4133-b001-dfc699ae3d5b nodeName:}" failed. No retries permitted until 2026-04-17 16:36:46.286395429 +0000 UTC m=+328.873984377 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/0070cb8f-71eb-4133-b001-dfc699ae3d5b-certificates") pod "keda-operator-ffbb595cb-b69gd" (UID: "0070cb8f-71eb-4133-b001-dfc699ae3d5b") : references non-existent secret key: ca.crt
Apr 17 16:36:46.303576 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:46.303541 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/0070cb8f-71eb-4133-b001-dfc699ae3d5b-certificates\") pod \"keda-operator-ffbb595cb-b69gd\" (UID: \"0070cb8f-71eb-4133-b001-dfc699ae3d5b\") " pod="openshift-keda/keda-operator-ffbb595cb-b69gd"
Apr 17 16:36:46.303976 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:36:46.303672 2572 secret.go:281] references non-existent secret key: ca.crt
Apr 17 16:36:46.303976 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:36:46.303688 2572 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 17 16:36:46.303976 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:36:46.303696 2572 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-b69gd: references non-existent secret key: ca.crt
Apr 17 16:36:46.303976 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:36:46.303763 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0070cb8f-71eb-4133-b001-dfc699ae3d5b-certificates podName:0070cb8f-71eb-4133-b001-dfc699ae3d5b nodeName:}" failed. No retries permitted until 2026-04-17 16:36:50.303750824 +0000 UTC m=+332.891339771 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/0070cb8f-71eb-4133-b001-dfc699ae3d5b-certificates") pod "keda-operator-ffbb595cb-b69gd" (UID: "0070cb8f-71eb-4133-b001-dfc699ae3d5b") : references non-existent secret key: ca.crt
Apr 17 16:36:50.331045 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:50.331014 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/0070cb8f-71eb-4133-b001-dfc699ae3d5b-certificates\") pod \"keda-operator-ffbb595cb-b69gd\" (UID: \"0070cb8f-71eb-4133-b001-dfc699ae3d5b\") " pod="openshift-keda/keda-operator-ffbb595cb-b69gd"
Apr 17 16:36:50.333527 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:50.333506 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/0070cb8f-71eb-4133-b001-dfc699ae3d5b-certificates\") pod \"keda-operator-ffbb595cb-b69gd\" (UID: \"0070cb8f-71eb-4133-b001-dfc699ae3d5b\") " pod="openshift-keda/keda-operator-ffbb595cb-b69gd"
Apr 17 16:36:50.347318 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:50.347295 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-b69gd"
Apr 17 16:36:50.469807 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:50.469781 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-b69gd"]
Apr 17 16:36:50.472761 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:36:50.472736 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0070cb8f_71eb_4133_b001_dfc699ae3d5b.slice/crio-0cb486c431d59b5088dea549e9df98bdd9a03736b96a0664d92460d4ccb50c06 WatchSource:0}: Error finding container 0cb486c431d59b5088dea549e9df98bdd9a03736b96a0664d92460d4ccb50c06: Status 404 returned error can't find the container with id 0cb486c431d59b5088dea549e9df98bdd9a03736b96a0664d92460d4ccb50c06
Apr 17 16:36:51.052511 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:51.052469 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-b69gd" event={"ID":"0070cb8f-71eb-4133-b001-dfc699ae3d5b","Type":"ContainerStarted","Data":"0cb486c431d59b5088dea549e9df98bdd9a03736b96a0664d92460d4ccb50c06"}
Apr 17 16:36:54.063000 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:54.062965 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-b69gd" event={"ID":"0070cb8f-71eb-4133-b001-dfc699ae3d5b","Type":"ContainerStarted","Data":"89afedec2ac63a17bb271b4e1c893c22b2071dcd55123fd24dc427b0c571e88e"}
Apr 17 16:36:54.063373 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:54.063140 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-b69gd"
Apr 17 16:36:54.080453 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:36:54.080411 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-b69gd" podStartSLOduration=9.277762881 podStartE2EDuration="12.080397624s" podCreationTimestamp="2026-04-17 16:36:42 +0000 UTC" firstStartedPulling="2026-04-17 16:36:50.474401811 +0000 UTC m=+333.061990759" lastFinishedPulling="2026-04-17 16:36:53.277036551 +0000 UTC m=+335.864625502" observedRunningTime="2026-04-17 16:36:54.078969526 +0000 UTC m=+336.666558495" watchObservedRunningTime="2026-04-17 16:36:54.080397624 +0000 UTC m=+336.667986595"
Apr 17 16:37:04.031070 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:04.031041 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-7lr2k"
Apr 17 16:37:15.069064 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:15.068987 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-b69gd"
Apr 17 16:37:36.293288 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:36.293252 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5g4q59"]
Apr 17 16:37:36.296648 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:36.296628 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5g4q59"
Apr 17 16:37:36.299102 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:36.299083 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 17 16:37:36.299840 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:36.299824 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-q9kms\""
Apr 17 16:37:36.300308 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:36.300292 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 17 16:37:36.309631 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:36.309612 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5g4q59"]
Apr 17 16:37:36.396495 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:36.396468 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnwml\" (UniqueName: \"kubernetes.io/projected/4e1755e5-24b9-43d7-b755-19d6a6f219e4-kube-api-access-hnwml\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5g4q59\" (UID: \"4e1755e5-24b9-43d7-b755-19d6a6f219e4\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5g4q59"
Apr 17 16:37:36.396620 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:36.396510 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4e1755e5-24b9-43d7-b755-19d6a6f219e4-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5g4q59\" (UID: \"4e1755e5-24b9-43d7-b755-19d6a6f219e4\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5g4q59"
Apr 17 16:37:36.396620 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:36.396571 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4e1755e5-24b9-43d7-b755-19d6a6f219e4-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5g4q59\" (UID: \"4e1755e5-24b9-43d7-b755-19d6a6f219e4\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5g4q59"
Apr 17 16:37:36.502651 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:36.502616 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4e1755e5-24b9-43d7-b755-19d6a6f219e4-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5g4q59\" (UID: \"4e1755e5-24b9-43d7-b755-19d6a6f219e4\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5g4q59"
Apr 17 16:37:36.502834 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:36.502744 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hnwml\" (UniqueName: \"kubernetes.io/projected/4e1755e5-24b9-43d7-b755-19d6a6f219e4-kube-api-access-hnwml\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5g4q59\" (UID: \"4e1755e5-24b9-43d7-b755-19d6a6f219e4\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5g4q59"
Apr 17 16:37:36.502834 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:36.502800 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4e1755e5-24b9-43d7-b755-19d6a6f219e4-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5g4q59\" (UID: \"4e1755e5-24b9-43d7-b755-19d6a6f219e4\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5g4q59"
Apr 17 16:37:36.503097 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:36.503076 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4e1755e5-24b9-43d7-b755-19d6a6f219e4-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5g4q59\" (UID: \"4e1755e5-24b9-43d7-b755-19d6a6f219e4\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5g4q59"
Apr 17 16:37:36.503097 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:36.503090 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4e1755e5-24b9-43d7-b755-19d6a6f219e4-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5g4q59\" (UID: \"4e1755e5-24b9-43d7-b755-19d6a6f219e4\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5g4q59"
Apr 17 16:37:36.512185 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:36.512161 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnwml\" (UniqueName: \"kubernetes.io/projected/4e1755e5-24b9-43d7-b755-19d6a6f219e4-kube-api-access-hnwml\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5g4q59\" (UID: \"4e1755e5-24b9-43d7-b755-19d6a6f219e4\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5g4q59"
Apr 17 16:37:36.606193 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:36.606169 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5g4q59"
Apr 17 16:37:36.730368 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:36.730346 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5g4q59"]
Apr 17 16:37:36.732904 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:37:36.732873 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e1755e5_24b9_43d7_b755_19d6a6f219e4.slice/crio-3638a8403661f09a3f9296e5503e31d3b3986b962de3de04ed0b55915ac8ba9c WatchSource:0}: Error finding container 3638a8403661f09a3f9296e5503e31d3b3986b962de3de04ed0b55915ac8ba9c: Status 404 returned error can't find the container with id 3638a8403661f09a3f9296e5503e31d3b3986b962de3de04ed0b55915ac8ba9c
Apr 17 16:37:37.197490 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:37.197455 2572 generic.go:358] "Generic (PLEG): container finished" podID="4e1755e5-24b9-43d7-b755-19d6a6f219e4" containerID="d4ef46bcc3b6ff20e6e940ad0dea351726a66dcd4b306c3defb8b7c0f2de6ee3" exitCode=0
Apr 17 16:37:37.197657 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:37.197552 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5g4q59" event={"ID":"4e1755e5-24b9-43d7-b755-19d6a6f219e4","Type":"ContainerDied","Data":"d4ef46bcc3b6ff20e6e940ad0dea351726a66dcd4b306c3defb8b7c0f2de6ee3"}
Apr 17 16:37:37.197657 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:37.197591 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5g4q59" event={"ID":"4e1755e5-24b9-43d7-b755-19d6a6f219e4","Type":"ContainerStarted","Data":"3638a8403661f09a3f9296e5503e31d3b3986b962de3de04ed0b55915ac8ba9c"}
Apr 17 16:37:38.202044 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:38.202019 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5g4q59" event={"ID":"4e1755e5-24b9-43d7-b755-19d6a6f219e4","Type":"ContainerStarted","Data":"96c3ade79cb1b434538c459b4dcdf6d8111b41947a58efe7dfe3818e2594b497"}
Apr 17 16:37:39.207312 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:39.207274 2572 generic.go:358] "Generic (PLEG): container finished" podID="4e1755e5-24b9-43d7-b755-19d6a6f219e4" containerID="96c3ade79cb1b434538c459b4dcdf6d8111b41947a58efe7dfe3818e2594b497" exitCode=0
Apr 17 16:37:39.207312 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:39.207317 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5g4q59" event={"ID":"4e1755e5-24b9-43d7-b755-19d6a6f219e4","Type":"ContainerDied","Data":"96c3ade79cb1b434538c459b4dcdf6d8111b41947a58efe7dfe3818e2594b497"}
Apr 17 16:37:40.211905 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:40.211873 2572 generic.go:358] "Generic (PLEG): container finished" podID="4e1755e5-24b9-43d7-b755-19d6a6f219e4" containerID="1e6ba01e0a324ec4427db4af5b169365333bfc80814e36333fa8785bb50e4106" exitCode=0
Apr 17 16:37:40.212288 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:40.211962 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5g4q59" event={"ID":"4e1755e5-24b9-43d7-b755-19d6a6f219e4","Type":"ContainerDied","Data":"1e6ba01e0a324ec4427db4af5b169365333bfc80814e36333fa8785bb50e4106"}
Apr 17 16:37:41.333940 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:41.333918 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5g4q59"
Apr 17 16:37:41.341817 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:41.341796 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4e1755e5-24b9-43d7-b755-19d6a6f219e4-util\") pod \"4e1755e5-24b9-43d7-b755-19d6a6f219e4\" (UID: \"4e1755e5-24b9-43d7-b755-19d6a6f219e4\") "
Apr 17 16:37:41.341923 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:41.341857 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hnwml\" (UniqueName: \"kubernetes.io/projected/4e1755e5-24b9-43d7-b755-19d6a6f219e4-kube-api-access-hnwml\") pod \"4e1755e5-24b9-43d7-b755-19d6a6f219e4\" (UID: \"4e1755e5-24b9-43d7-b755-19d6a6f219e4\") "
Apr 17 16:37:41.341923 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:41.341875 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4e1755e5-24b9-43d7-b755-19d6a6f219e4-bundle\") pod \"4e1755e5-24b9-43d7-b755-19d6a6f219e4\" (UID: \"4e1755e5-24b9-43d7-b755-19d6a6f219e4\") "
Apr 17 16:37:41.342629 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:41.342602 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e1755e5-24b9-43d7-b755-19d6a6f219e4-bundle" (OuterVolumeSpecName: "bundle") pod "4e1755e5-24b9-43d7-b755-19d6a6f219e4" (UID: "4e1755e5-24b9-43d7-b755-19d6a6f219e4"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 16:37:41.344013 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:41.343988 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e1755e5-24b9-43d7-b755-19d6a6f219e4-kube-api-access-hnwml" (OuterVolumeSpecName: "kube-api-access-hnwml") pod "4e1755e5-24b9-43d7-b755-19d6a6f219e4" (UID: "4e1755e5-24b9-43d7-b755-19d6a6f219e4"). InnerVolumeSpecName "kube-api-access-hnwml". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 16:37:41.347557 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:41.347534 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e1755e5-24b9-43d7-b755-19d6a6f219e4-util" (OuterVolumeSpecName: "util") pod "4e1755e5-24b9-43d7-b755-19d6a6f219e4" (UID: "4e1755e5-24b9-43d7-b755-19d6a6f219e4"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 16:37:41.443187 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:41.443149 2572 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4e1755e5-24b9-43d7-b755-19d6a6f219e4-util\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\""
Apr 17 16:37:41.443187 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:41.443180 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hnwml\" (UniqueName: \"kubernetes.io/projected/4e1755e5-24b9-43d7-b755-19d6a6f219e4-kube-api-access-hnwml\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\""
Apr 17 16:37:41.443187 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:41.443191 2572 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4e1755e5-24b9-43d7-b755-19d6a6f219e4-bundle\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\""
Apr 17 16:37:42.218964 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:42.218931 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5g4q59" event={"ID":"4e1755e5-24b9-43d7-b755-19d6a6f219e4","Type":"ContainerDied","Data":"3638a8403661f09a3f9296e5503e31d3b3986b962de3de04ed0b55915ac8ba9c"}
Apr 17 16:37:42.218964 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:42.218964 2572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3638a8403661f09a3f9296e5503e31d3b3986b962de3de04ed0b55915ac8ba9c"
Apr 17 16:37:42.219172 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:42.219023 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5g4q59"
Apr 17 16:37:48.730144 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:48.730114 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-5vpl7"]
Apr 17 16:37:48.730502 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:48.730415 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4e1755e5-24b9-43d7-b755-19d6a6f219e4" containerName="pull"
Apr 17 16:37:48.730502 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:48.730425 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e1755e5-24b9-43d7-b755-19d6a6f219e4" containerName="pull"
Apr 17 16:37:48.730502 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:48.730437 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4e1755e5-24b9-43d7-b755-19d6a6f219e4" containerName="extract"
Apr 17 16:37:48.730502 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:48.730442 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e1755e5-24b9-43d7-b755-19d6a6f219e4" containerName="extract"
Apr 17 16:37:48.730502 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:48.730456 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4e1755e5-24b9-43d7-b755-19d6a6f219e4" containerName="util"
Apr 17 16:37:48.730502 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:48.730461 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e1755e5-24b9-43d7-b755-19d6a6f219e4" containerName="util"
Apr 17 16:37:48.730688 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:48.730505 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="4e1755e5-24b9-43d7-b755-19d6a6f219e4" containerName="extract"
Apr 17 16:37:48.734489 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:48.734473 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-5vpl7"
Apr 17 16:37:48.737255 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:48.737232 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\""
Apr 17 16:37:48.738062 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:48.737474 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-w49b5\""
Apr 17 16:37:48.738062 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:48.737589 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\""
Apr 17 16:37:48.746544 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:48.746524 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-5vpl7"]
Apr 17 16:37:48.799539 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:48.799511 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpnz5\" (UniqueName: \"kubernetes.io/projected/6b23b903-b2ac-4f32-8561-91c0539442f9-kube-api-access-fpnz5\") pod
\"cert-manager-operator-controller-manager-7ccfb878b5-5vpl7\" (UID: \"6b23b903-b2ac-4f32-8561-91c0539442f9\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-5vpl7" Apr 17 16:37:48.799669 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:48.799557 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6b23b903-b2ac-4f32-8561-91c0539442f9-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-5vpl7\" (UID: \"6b23b903-b2ac-4f32-8561-91c0539442f9\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-5vpl7" Apr 17 16:37:48.900618 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:48.900594 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fpnz5\" (UniqueName: \"kubernetes.io/projected/6b23b903-b2ac-4f32-8561-91c0539442f9-kube-api-access-fpnz5\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-5vpl7\" (UID: \"6b23b903-b2ac-4f32-8561-91c0539442f9\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-5vpl7" Apr 17 16:37:48.900759 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:48.900641 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6b23b903-b2ac-4f32-8561-91c0539442f9-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-5vpl7\" (UID: \"6b23b903-b2ac-4f32-8561-91c0539442f9\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-5vpl7" Apr 17 16:37:48.901053 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:48.901037 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6b23b903-b2ac-4f32-8561-91c0539442f9-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-5vpl7\" (UID: \"6b23b903-b2ac-4f32-8561-91c0539442f9\") " 
pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-5vpl7" Apr 17 16:37:48.909465 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:48.909445 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpnz5\" (UniqueName: \"kubernetes.io/projected/6b23b903-b2ac-4f32-8561-91c0539442f9-kube-api-access-fpnz5\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-5vpl7\" (UID: \"6b23b903-b2ac-4f32-8561-91c0539442f9\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-5vpl7" Apr 17 16:37:49.046853 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:49.046783 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-5vpl7" Apr 17 16:37:49.170555 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:49.170531 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-5vpl7"] Apr 17 16:37:49.172142 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:37:49.172105 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b23b903_b2ac_4f32_8561_91c0539442f9.slice/crio-1504470e456ff22e09474afbab534a349980bc37f901b73b64339c57ec53d9d5 WatchSource:0}: Error finding container 1504470e456ff22e09474afbab534a349980bc37f901b73b64339c57ec53d9d5: Status 404 returned error can't find the container with id 1504470e456ff22e09474afbab534a349980bc37f901b73b64339c57ec53d9d5 Apr 17 16:37:49.242494 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:49.242463 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-5vpl7" event={"ID":"6b23b903-b2ac-4f32-8561-91c0539442f9","Type":"ContainerStarted","Data":"1504470e456ff22e09474afbab534a349980bc37f901b73b64339c57ec53d9d5"} Apr 17 16:37:51.250986 
ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:51.250946 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-5vpl7" event={"ID":"6b23b903-b2ac-4f32-8561-91c0539442f9","Type":"ContainerStarted","Data":"8ad0b6fe13a7c46038b3634ecfb621002d7bd65085f912d33fed67ea2e467af6"} Apr 17 16:37:51.275373 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:51.275197 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-5vpl7" podStartSLOduration=1.650801357 podStartE2EDuration="3.275180107s" podCreationTimestamp="2026-04-17 16:37:48 +0000 UTC" firstStartedPulling="2026-04-17 16:37:49.174597942 +0000 UTC m=+391.762186904" lastFinishedPulling="2026-04-17 16:37:50.798976706 +0000 UTC m=+393.386565654" observedRunningTime="2026-04-17 16:37:51.272429032 +0000 UTC m=+393.860018003" watchObservedRunningTime="2026-04-17 16:37:51.275180107 +0000 UTC m=+393.862769077" Apr 17 16:37:52.590612 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:52.590578 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fppxvf"] Apr 17 16:37:52.594936 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:52.594913 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fppxvf" Apr 17 16:37:52.597816 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:52.597789 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-q9kms\"" Apr 17 16:37:52.597934 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:52.597851 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 17 16:37:52.598529 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:52.598511 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 17 16:37:52.609315 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:52.609291 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fppxvf"] Apr 17 16:37:52.631101 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:52.631077 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7v2xt\" (UniqueName: \"kubernetes.io/projected/26efcb2c-f5e8-4e86-85fa-27ba5ef7a0f2-kube-api-access-7v2xt\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fppxvf\" (UID: \"26efcb2c-f5e8-4e86-85fa-27ba5ef7a0f2\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fppxvf" Apr 17 16:37:52.631205 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:52.631125 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/26efcb2c-f5e8-4e86-85fa-27ba5ef7a0f2-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fppxvf\" (UID: \"26efcb2c-f5e8-4e86-85fa-27ba5ef7a0f2\") " 
pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fppxvf" Apr 17 16:37:52.631205 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:52.631175 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/26efcb2c-f5e8-4e86-85fa-27ba5ef7a0f2-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fppxvf\" (UID: \"26efcb2c-f5e8-4e86-85fa-27ba5ef7a0f2\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fppxvf" Apr 17 16:37:52.731533 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:52.731503 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/26efcb2c-f5e8-4e86-85fa-27ba5ef7a0f2-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fppxvf\" (UID: \"26efcb2c-f5e8-4e86-85fa-27ba5ef7a0f2\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fppxvf" Apr 17 16:37:52.731674 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:52.731547 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/26efcb2c-f5e8-4e86-85fa-27ba5ef7a0f2-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fppxvf\" (UID: \"26efcb2c-f5e8-4e86-85fa-27ba5ef7a0f2\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fppxvf" Apr 17 16:37:52.731674 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:52.731592 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7v2xt\" (UniqueName: \"kubernetes.io/projected/26efcb2c-f5e8-4e86-85fa-27ba5ef7a0f2-kube-api-access-7v2xt\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fppxvf\" (UID: \"26efcb2c-f5e8-4e86-85fa-27ba5ef7a0f2\") " 
pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fppxvf" Apr 17 16:37:52.731971 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:52.731950 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/26efcb2c-f5e8-4e86-85fa-27ba5ef7a0f2-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fppxvf\" (UID: \"26efcb2c-f5e8-4e86-85fa-27ba5ef7a0f2\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fppxvf" Apr 17 16:37:52.732016 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:52.731959 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/26efcb2c-f5e8-4e86-85fa-27ba5ef7a0f2-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fppxvf\" (UID: \"26efcb2c-f5e8-4e86-85fa-27ba5ef7a0f2\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fppxvf" Apr 17 16:37:52.757220 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:52.757194 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7v2xt\" (UniqueName: \"kubernetes.io/projected/26efcb2c-f5e8-4e86-85fa-27ba5ef7a0f2-kube-api-access-7v2xt\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fppxvf\" (UID: \"26efcb2c-f5e8-4e86-85fa-27ba5ef7a0f2\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fppxvf" Apr 17 16:37:52.904803 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:52.904709 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fppxvf" Apr 17 16:37:53.055230 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:53.055204 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fppxvf"] Apr 17 16:37:53.057026 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:37:53.057000 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26efcb2c_f5e8_4e86_85fa_27ba5ef7a0f2.slice/crio-cb55ec6ebfcb996bdb62b13ff16f4b16c21461c0f9113e0f3a991889f00b08f9 WatchSource:0}: Error finding container cb55ec6ebfcb996bdb62b13ff16f4b16c21461c0f9113e0f3a991889f00b08f9: Status 404 returned error can't find the container with id cb55ec6ebfcb996bdb62b13ff16f4b16c21461c0f9113e0f3a991889f00b08f9 Apr 17 16:37:53.267323 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:53.267292 2572 generic.go:358] "Generic (PLEG): container finished" podID="26efcb2c-f5e8-4e86-85fa-27ba5ef7a0f2" containerID="aee0facc2fe80d596a7bdd3087c3f9786567a500af0c08dc533ebffcddf16075" exitCode=0 Apr 17 16:37:53.267473 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:53.267385 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fppxvf" event={"ID":"26efcb2c-f5e8-4e86-85fa-27ba5ef7a0f2","Type":"ContainerDied","Data":"aee0facc2fe80d596a7bdd3087c3f9786567a500af0c08dc533ebffcddf16075"} Apr 17 16:37:53.267473 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:53.267428 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fppxvf" event={"ID":"26efcb2c-f5e8-4e86-85fa-27ba5ef7a0f2","Type":"ContainerStarted","Data":"cb55ec6ebfcb996bdb62b13ff16f4b16c21461c0f9113e0f3a991889f00b08f9"} Apr 17 16:37:54.928436 ip-10-0-131-177 kubenswrapper[2572]: 
I0417 16:37:54.928402 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-jj649"] Apr 17 16:37:54.937475 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:54.937453 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-jj649" Apr 17 16:37:54.940244 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:54.940220 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 17 16:37:54.940516 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:54.940495 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 17 16:37:54.940958 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:54.940917 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-jj649"] Apr 17 16:37:54.942054 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:54.942033 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-fql5c\"" Apr 17 16:37:55.053164 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:55.053092 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zm9rj\" (UniqueName: \"kubernetes.io/projected/a80b8115-3c2a-48af-a5d2-e10f4dd437be-kube-api-access-zm9rj\") pod \"cert-manager-webhook-597b96b99b-jj649\" (UID: \"a80b8115-3c2a-48af-a5d2-e10f4dd437be\") " pod="cert-manager/cert-manager-webhook-597b96b99b-jj649" Apr 17 16:37:55.053332 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:55.053288 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a80b8115-3c2a-48af-a5d2-e10f4dd437be-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-jj649\" (UID: 
\"a80b8115-3c2a-48af-a5d2-e10f4dd437be\") " pod="cert-manager/cert-manager-webhook-597b96b99b-jj649" Apr 17 16:37:55.154674 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:55.154633 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a80b8115-3c2a-48af-a5d2-e10f4dd437be-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-jj649\" (UID: \"a80b8115-3c2a-48af-a5d2-e10f4dd437be\") " pod="cert-manager/cert-manager-webhook-597b96b99b-jj649" Apr 17 16:37:55.154878 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:55.154690 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zm9rj\" (UniqueName: \"kubernetes.io/projected/a80b8115-3c2a-48af-a5d2-e10f4dd437be-kube-api-access-zm9rj\") pod \"cert-manager-webhook-597b96b99b-jj649\" (UID: \"a80b8115-3c2a-48af-a5d2-e10f4dd437be\") " pod="cert-manager/cert-manager-webhook-597b96b99b-jj649" Apr 17 16:37:55.163814 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:55.163781 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a80b8115-3c2a-48af-a5d2-e10f4dd437be-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-jj649\" (UID: \"a80b8115-3c2a-48af-a5d2-e10f4dd437be\") " pod="cert-manager/cert-manager-webhook-597b96b99b-jj649" Apr 17 16:37:55.164392 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:55.164365 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zm9rj\" (UniqueName: \"kubernetes.io/projected/a80b8115-3c2a-48af-a5d2-e10f4dd437be-kube-api-access-zm9rj\") pod \"cert-manager-webhook-597b96b99b-jj649\" (UID: \"a80b8115-3c2a-48af-a5d2-e10f4dd437be\") " pod="cert-manager/cert-manager-webhook-597b96b99b-jj649" Apr 17 16:37:55.256136 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:55.256106 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-jj649" Apr 17 16:37:55.385002 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:55.384978 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-jj649"] Apr 17 16:37:55.386695 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:37:55.386671 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda80b8115_3c2a_48af_a5d2_e10f4dd437be.slice/crio-7d7c1a201ed3370e6846843a0667f40165e2102234beadac5bfada4254adc698 WatchSource:0}: Error finding container 7d7c1a201ed3370e6846843a0667f40165e2102234beadac5bfada4254adc698: Status 404 returned error can't find the container with id 7d7c1a201ed3370e6846843a0667f40165e2102234beadac5bfada4254adc698 Apr 17 16:37:56.279168 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:56.279128 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-jj649" event={"ID":"a80b8115-3c2a-48af-a5d2-e10f4dd437be","Type":"ContainerStarted","Data":"7d7c1a201ed3370e6846843a0667f40165e2102234beadac5bfada4254adc698"} Apr 17 16:37:56.280991 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:56.280959 2572 generic.go:358] "Generic (PLEG): container finished" podID="26efcb2c-f5e8-4e86-85fa-27ba5ef7a0f2" containerID="2118b894737742fce6586f099dcee49e8a51698e1744112f4da75840aec576dc" exitCode=0 Apr 17 16:37:56.281126 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:56.281034 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fppxvf" event={"ID":"26efcb2c-f5e8-4e86-85fa-27ba5ef7a0f2","Type":"ContainerDied","Data":"2118b894737742fce6586f099dcee49e8a51698e1744112f4da75840aec576dc"} Apr 17 16:37:57.287091 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:57.287053 2572 generic.go:358] "Generic (PLEG): container finished" 
podID="26efcb2c-f5e8-4e86-85fa-27ba5ef7a0f2" containerID="074106707c8566e798644eceb54259f41f311ba197f68c9dbefd47291bab519f" exitCode=0 Apr 17 16:37:57.287512 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:57.287188 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fppxvf" event={"ID":"26efcb2c-f5e8-4e86-85fa-27ba5ef7a0f2","Type":"ContainerDied","Data":"074106707c8566e798644eceb54259f41f311ba197f68c9dbefd47291bab519f"} Apr 17 16:37:57.943528 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:57.943491 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-lvtk7"] Apr 17 16:37:57.947076 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:57.947050 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-lvtk7" Apr 17 16:37:57.952555 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:57.951851 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-gdt4z\"" Apr 17 16:37:57.962105 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:57.962081 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-lvtk7"] Apr 17 16:37:57.978859 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:57.978833 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ff9sl\" (UniqueName: \"kubernetes.io/projected/48ccd28f-83a6-4c3c-83cc-98771021ba9e-kube-api-access-ff9sl\") pod \"cert-manager-cainjector-8966b78d4-lvtk7\" (UID: \"48ccd28f-83a6-4c3c-83cc-98771021ba9e\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-lvtk7" Apr 17 16:37:57.978990 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:57.978876 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/48ccd28f-83a6-4c3c-83cc-98771021ba9e-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-lvtk7\" (UID: \"48ccd28f-83a6-4c3c-83cc-98771021ba9e\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-lvtk7" Apr 17 16:37:58.080300 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:58.080266 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ff9sl\" (UniqueName: \"kubernetes.io/projected/48ccd28f-83a6-4c3c-83cc-98771021ba9e-kube-api-access-ff9sl\") pod \"cert-manager-cainjector-8966b78d4-lvtk7\" (UID: \"48ccd28f-83a6-4c3c-83cc-98771021ba9e\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-lvtk7" Apr 17 16:37:58.080469 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:58.080319 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/48ccd28f-83a6-4c3c-83cc-98771021ba9e-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-lvtk7\" (UID: \"48ccd28f-83a6-4c3c-83cc-98771021ba9e\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-lvtk7" Apr 17 16:37:58.088909 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:58.088881 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/48ccd28f-83a6-4c3c-83cc-98771021ba9e-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-lvtk7\" (UID: \"48ccd28f-83a6-4c3c-83cc-98771021ba9e\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-lvtk7" Apr 17 16:37:58.089866 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:58.089839 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ff9sl\" (UniqueName: \"kubernetes.io/projected/48ccd28f-83a6-4c3c-83cc-98771021ba9e-kube-api-access-ff9sl\") pod \"cert-manager-cainjector-8966b78d4-lvtk7\" (UID: \"48ccd28f-83a6-4c3c-83cc-98771021ba9e\") " 
pod="cert-manager/cert-manager-cainjector-8966b78d4-lvtk7" Apr 17 16:37:58.264812 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:58.264789 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-lvtk7" Apr 17 16:37:58.292279 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:58.292237 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-jj649" event={"ID":"a80b8115-3c2a-48af-a5d2-e10f4dd437be","Type":"ContainerStarted","Data":"498c71d30191f334370cab9815a0964db67da28e80f348de0721675407ced65d"} Apr 17 16:37:58.292695 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:58.292378 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-597b96b99b-jj649" Apr 17 16:37:58.312006 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:58.311949 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-597b96b99b-jj649" podStartSLOduration=1.487415465 podStartE2EDuration="4.311935286s" podCreationTimestamp="2026-04-17 16:37:54 +0000 UTC" firstStartedPulling="2026-04-17 16:37:55.388615658 +0000 UTC m=+397.976204606" lastFinishedPulling="2026-04-17 16:37:58.21313548 +0000 UTC m=+400.800724427" observedRunningTime="2026-04-17 16:37:58.309789371 +0000 UTC m=+400.897378342" watchObservedRunningTime="2026-04-17 16:37:58.311935286 +0000 UTC m=+400.899524256" Apr 17 16:37:58.417503 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:58.417483 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fppxvf" Apr 17 16:37:58.421026 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:58.420975 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-lvtk7"] Apr 17 16:37:58.423040 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:37:58.423016 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod48ccd28f_83a6_4c3c_83cc_98771021ba9e.slice/crio-8285fe0c9ebc68fe4127d67ef9550b055684ce34fd7f1e5a74e028339515d5ba WatchSource:0}: Error finding container 8285fe0c9ebc68fe4127d67ef9550b055684ce34fd7f1e5a74e028339515d5ba: Status 404 returned error can't find the container with id 8285fe0c9ebc68fe4127d67ef9550b055684ce34fd7f1e5a74e028339515d5ba Apr 17 16:37:58.483054 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:58.483031 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/26efcb2c-f5e8-4e86-85fa-27ba5ef7a0f2-bundle\") pod \"26efcb2c-f5e8-4e86-85fa-27ba5ef7a0f2\" (UID: \"26efcb2c-f5e8-4e86-85fa-27ba5ef7a0f2\") " Apr 17 16:37:58.483178 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:58.483116 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/26efcb2c-f5e8-4e86-85fa-27ba5ef7a0f2-util\") pod \"26efcb2c-f5e8-4e86-85fa-27ba5ef7a0f2\" (UID: \"26efcb2c-f5e8-4e86-85fa-27ba5ef7a0f2\") " Apr 17 16:37:58.483178 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:58.483161 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7v2xt\" (UniqueName: \"kubernetes.io/projected/26efcb2c-f5e8-4e86-85fa-27ba5ef7a0f2-kube-api-access-7v2xt\") pod \"26efcb2c-f5e8-4e86-85fa-27ba5ef7a0f2\" (UID: \"26efcb2c-f5e8-4e86-85fa-27ba5ef7a0f2\") " Apr 17 16:37:58.483431 
ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:58.483408 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26efcb2c-f5e8-4e86-85fa-27ba5ef7a0f2-bundle" (OuterVolumeSpecName: "bundle") pod "26efcb2c-f5e8-4e86-85fa-27ba5ef7a0f2" (UID: "26efcb2c-f5e8-4e86-85fa-27ba5ef7a0f2"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 16:37:58.485145 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:58.485112 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26efcb2c-f5e8-4e86-85fa-27ba5ef7a0f2-kube-api-access-7v2xt" (OuterVolumeSpecName: "kube-api-access-7v2xt") pod "26efcb2c-f5e8-4e86-85fa-27ba5ef7a0f2" (UID: "26efcb2c-f5e8-4e86-85fa-27ba5ef7a0f2"). InnerVolumeSpecName "kube-api-access-7v2xt". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 16:37:58.488021 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:58.487979 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26efcb2c-f5e8-4e86-85fa-27ba5ef7a0f2-util" (OuterVolumeSpecName: "util") pod "26efcb2c-f5e8-4e86-85fa-27ba5ef7a0f2" (UID: "26efcb2c-f5e8-4e86-85fa-27ba5ef7a0f2"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 16:37:58.584015 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:58.583976 2572 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/26efcb2c-f5e8-4e86-85fa-27ba5ef7a0f2-bundle\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\""
Apr 17 16:37:58.584015 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:58.584007 2572 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/26efcb2c-f5e8-4e86-85fa-27ba5ef7a0f2-util\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\""
Apr 17 16:37:58.584192 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:58.584022 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7v2xt\" (UniqueName: \"kubernetes.io/projected/26efcb2c-f5e8-4e86-85fa-27ba5ef7a0f2-kube-api-access-7v2xt\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\""
Apr 17 16:37:59.297310 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:59.297274 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-lvtk7" event={"ID":"48ccd28f-83a6-4c3c-83cc-98771021ba9e","Type":"ContainerStarted","Data":"e998327682377bbb7d503d426aac11bf59b7bf536181b1c34848589a14cb333b"}
Apr 17 16:37:59.297310 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:59.297313 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-lvtk7" event={"ID":"48ccd28f-83a6-4c3c-83cc-98771021ba9e","Type":"ContainerStarted","Data":"8285fe0c9ebc68fe4127d67ef9550b055684ce34fd7f1e5a74e028339515d5ba"}
Apr 17 16:37:59.299109 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:59.299083 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fppxvf"
Apr 17 16:37:59.299109 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:59.299093 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fppxvf" event={"ID":"26efcb2c-f5e8-4e86-85fa-27ba5ef7a0f2","Type":"ContainerDied","Data":"cb55ec6ebfcb996bdb62b13ff16f4b16c21461c0f9113e0f3a991889f00b08f9"}
Apr 17 16:37:59.299295 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:59.299120 2572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cb55ec6ebfcb996bdb62b13ff16f4b16c21461c0f9113e0f3a991889f00b08f9"
Apr 17 16:37:59.312303 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:37:59.312248 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-8966b78d4-lvtk7" podStartSLOduration=2.312238706 podStartE2EDuration="2.312238706s" podCreationTimestamp="2026-04-17 16:37:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:37:59.311239648 +0000 UTC m=+401.898828627" watchObservedRunningTime="2026-04-17 16:37:59.312238706 +0000 UTC m=+401.899827676"
Apr 17 16:38:04.301542 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:04.301512 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-597b96b99b-jj649"
Apr 17 16:38:05.314540 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:05.314507 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-759f64656b-dbm5z"]
Apr 17 16:38:05.314944 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:05.314858 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="26efcb2c-f5e8-4e86-85fa-27ba5ef7a0f2" containerName="util"
Apr 17 16:38:05.314944 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:05.314871 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="26efcb2c-f5e8-4e86-85fa-27ba5ef7a0f2" containerName="util"
Apr 17 16:38:05.314944 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:05.314883 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="26efcb2c-f5e8-4e86-85fa-27ba5ef7a0f2" containerName="extract"
Apr 17 16:38:05.314944 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:05.314889 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="26efcb2c-f5e8-4e86-85fa-27ba5ef7a0f2" containerName="extract"
Apr 17 16:38:05.314944 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:05.314896 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="26efcb2c-f5e8-4e86-85fa-27ba5ef7a0f2" containerName="pull"
Apr 17 16:38:05.314944 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:05.314902 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="26efcb2c-f5e8-4e86-85fa-27ba5ef7a0f2" containerName="pull"
Apr 17 16:38:05.315118 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:05.314973 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="26efcb2c-f5e8-4e86-85fa-27ba5ef7a0f2" containerName="extract"
Apr 17 16:38:05.317664 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:05.317642 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-759f64656b-dbm5z"
Apr 17 16:38:05.321294 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:05.321275 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-dpqk6\""
Apr 17 16:38:05.339731 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:05.339689 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-dbm5z"]
Apr 17 16:38:05.441002 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:05.440978 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnts5\" (UniqueName: \"kubernetes.io/projected/6626b358-58b5-4c54-9a72-420dc06a86cf-kube-api-access-mnts5\") pod \"cert-manager-759f64656b-dbm5z\" (UID: \"6626b358-58b5-4c54-9a72-420dc06a86cf\") " pod="cert-manager/cert-manager-759f64656b-dbm5z"
Apr 17 16:38:05.441154 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:05.441028 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6626b358-58b5-4c54-9a72-420dc06a86cf-bound-sa-token\") pod \"cert-manager-759f64656b-dbm5z\" (UID: \"6626b358-58b5-4c54-9a72-420dc06a86cf\") " pod="cert-manager/cert-manager-759f64656b-dbm5z"
Apr 17 16:38:05.541740 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:05.541689 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mnts5\" (UniqueName: \"kubernetes.io/projected/6626b358-58b5-4c54-9a72-420dc06a86cf-kube-api-access-mnts5\") pod \"cert-manager-759f64656b-dbm5z\" (UID: \"6626b358-58b5-4c54-9a72-420dc06a86cf\") " pod="cert-manager/cert-manager-759f64656b-dbm5z"
Apr 17 16:38:05.541867 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:05.541763 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6626b358-58b5-4c54-9a72-420dc06a86cf-bound-sa-token\") pod \"cert-manager-759f64656b-dbm5z\" (UID: \"6626b358-58b5-4c54-9a72-420dc06a86cf\") " pod="cert-manager/cert-manager-759f64656b-dbm5z"
Apr 17 16:38:05.550219 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:05.550192 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6626b358-58b5-4c54-9a72-420dc06a86cf-bound-sa-token\") pod \"cert-manager-759f64656b-dbm5z\" (UID: \"6626b358-58b5-4c54-9a72-420dc06a86cf\") " pod="cert-manager/cert-manager-759f64656b-dbm5z"
Apr 17 16:38:05.550305 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:05.550249 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnts5\" (UniqueName: \"kubernetes.io/projected/6626b358-58b5-4c54-9a72-420dc06a86cf-kube-api-access-mnts5\") pod \"cert-manager-759f64656b-dbm5z\" (UID: \"6626b358-58b5-4c54-9a72-420dc06a86cf\") " pod="cert-manager/cert-manager-759f64656b-dbm5z"
Apr 17 16:38:05.626493 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:05.626471 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-759f64656b-dbm5z"
Apr 17 16:38:05.778449 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:05.778404 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-dbm5z"]
Apr 17 16:38:05.780795 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:38:05.780755 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6626b358_58b5_4c54_9a72_420dc06a86cf.slice/crio-f2279df2237e3c6af5ae315cb9028b70144debebbd0728ebc1aa44f7b3de9275 WatchSource:0}: Error finding container f2279df2237e3c6af5ae315cb9028b70144debebbd0728ebc1aa44f7b3de9275: Status 404 returned error can't find the container with id f2279df2237e3c6af5ae315cb9028b70144debebbd0728ebc1aa44f7b3de9275
Apr 17 16:38:06.323144 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:06.323107 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-dbm5z" event={"ID":"6626b358-58b5-4c54-9a72-420dc06a86cf","Type":"ContainerStarted","Data":"c4187317a835cb5c49da8cfb2d7fa425f5a68a520acecad5d8035b074c2e77f9"}
Apr 17 16:38:06.323144 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:06.323147 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-dbm5z" event={"ID":"6626b358-58b5-4c54-9a72-420dc06a86cf","Type":"ContainerStarted","Data":"f2279df2237e3c6af5ae315cb9028b70144debebbd0728ebc1aa44f7b3de9275"}
Apr 17 16:38:06.344945 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:06.344895 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-759f64656b-dbm5z" podStartSLOduration=1.344880354 podStartE2EDuration="1.344880354s" podCreationTimestamp="2026-04-17 16:38:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:38:06.343644191 +0000 UTC m=+408.931233161" watchObservedRunningTime="2026-04-17 16:38:06.344880354 +0000 UTC m=+408.932469325"
Apr 17 16:38:20.849470 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:20.846837 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835zrc7r"]
Apr 17 16:38:20.854264 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:20.854238 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835zrc7r"
Apr 17 16:38:20.857041 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:20.857018 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-q9kms\""
Apr 17 16:38:20.857148 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:20.857052 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 17 16:38:20.857412 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:20.857393 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 17 16:38:20.858776 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:20.858755 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835zrc7r"]
Apr 17 16:38:20.971483 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:20.971450 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/01a3d02a-4572-474f-9997-a0d856332153-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835zrc7r\" (UID: \"01a3d02a-4572-474f-9997-a0d856332153\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835zrc7r"
Apr 17 16:38:20.971636 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:20.971488 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/01a3d02a-4572-474f-9997-a0d856332153-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835zrc7r\" (UID: \"01a3d02a-4572-474f-9997-a0d856332153\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835zrc7r"
Apr 17 16:38:20.971636 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:20.971510 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56c2n\" (UniqueName: \"kubernetes.io/projected/01a3d02a-4572-474f-9997-a0d856332153-kube-api-access-56c2n\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835zrc7r\" (UID: \"01a3d02a-4572-474f-9997-a0d856332153\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835zrc7r"
Apr 17 16:38:21.072223 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:21.072186 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/01a3d02a-4572-474f-9997-a0d856332153-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835zrc7r\" (UID: \"01a3d02a-4572-474f-9997-a0d856332153\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835zrc7r"
Apr 17 16:38:21.072358 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:21.072232 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/01a3d02a-4572-474f-9997-a0d856332153-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835zrc7r\" (UID: \"01a3d02a-4572-474f-9997-a0d856332153\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835zrc7r"
Apr 17 16:38:21.072358 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:21.072264 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-56c2n\" (UniqueName: \"kubernetes.io/projected/01a3d02a-4572-474f-9997-a0d856332153-kube-api-access-56c2n\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835zrc7r\" (UID: \"01a3d02a-4572-474f-9997-a0d856332153\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835zrc7r"
Apr 17 16:38:21.072543 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:21.072527 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/01a3d02a-4572-474f-9997-a0d856332153-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835zrc7r\" (UID: \"01a3d02a-4572-474f-9997-a0d856332153\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835zrc7r"
Apr 17 16:38:21.072588 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:21.072555 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/01a3d02a-4572-474f-9997-a0d856332153-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835zrc7r\" (UID: \"01a3d02a-4572-474f-9997-a0d856332153\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835zrc7r"
Apr 17 16:38:21.080583 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:21.080561 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-56c2n\" (UniqueName: \"kubernetes.io/projected/01a3d02a-4572-474f-9997-a0d856332153-kube-api-access-56c2n\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835zrc7r\" (UID: \"01a3d02a-4572-474f-9997-a0d856332153\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835zrc7r"
Apr 17 16:38:21.164269 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:21.164205 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835zrc7r"
Apr 17 16:38:21.284195 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:21.284157 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835zrc7r"]
Apr 17 16:38:21.285822 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:38:21.285796 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01a3d02a_4572_474f_9997_a0d856332153.slice/crio-a0c0add8dabb54e01e9a63fbeb09c90faa716a225bfd4d215dd9f97db8d1f536 WatchSource:0}: Error finding container a0c0add8dabb54e01e9a63fbeb09c90faa716a225bfd4d215dd9f97db8d1f536: Status 404 returned error can't find the container with id a0c0add8dabb54e01e9a63fbeb09c90faa716a225bfd4d215dd9f97db8d1f536
Apr 17 16:38:21.373689 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:21.373651 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835zrc7r" event={"ID":"01a3d02a-4572-474f-9997-a0d856332153","Type":"ContainerStarted","Data":"be6719e0c8b37cb08bff76b2b79255bc6a5ef8bf5a327ab2078a7dc8a8e56c5c"}
Apr 17 16:38:21.373689 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:21.373692 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835zrc7r" event={"ID":"01a3d02a-4572-474f-9997-a0d856332153","Type":"ContainerStarted","Data":"a0c0add8dabb54e01e9a63fbeb09c90faa716a225bfd4d215dd9f97db8d1f536"}
Apr 17 16:38:22.378855 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:22.378824 2572 generic.go:358] "Generic (PLEG): container finished" podID="01a3d02a-4572-474f-9997-a0d856332153" containerID="be6719e0c8b37cb08bff76b2b79255bc6a5ef8bf5a327ab2078a7dc8a8e56c5c" exitCode=0
Apr 17 16:38:22.379201 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:22.378885 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835zrc7r" event={"ID":"01a3d02a-4572-474f-9997-a0d856332153","Type":"ContainerDied","Data":"be6719e0c8b37cb08bff76b2b79255bc6a5ef8bf5a327ab2078a7dc8a8e56c5c"}
Apr 17 16:38:23.386948 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:23.386852 2572 generic.go:358] "Generic (PLEG): container finished" podID="01a3d02a-4572-474f-9997-a0d856332153" containerID="ab11a97844e0e8e999eecc4f97be9d191e1642336d38fb4eaa78906950623af7" exitCode=0
Apr 17 16:38:23.387348 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:23.386936 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835zrc7r" event={"ID":"01a3d02a-4572-474f-9997-a0d856332153","Type":"ContainerDied","Data":"ab11a97844e0e8e999eecc4f97be9d191e1642336d38fb4eaa78906950623af7"}
Apr 17 16:38:24.392384 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:24.392348 2572 generic.go:358] "Generic (PLEG): container finished" podID="01a3d02a-4572-474f-9997-a0d856332153" containerID="fac14ddc5115e1b51158f20f76aa49db4f1be5e30a586346bde9784d53d08f1b" exitCode=0
Apr 17 16:38:24.392843 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:24.392402 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835zrc7r" event={"ID":"01a3d02a-4572-474f-9997-a0d856332153","Type":"ContainerDied","Data":"fac14ddc5115e1b51158f20f76aa49db4f1be5e30a586346bde9784d53d08f1b"}
Apr 17 16:38:25.520597 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:25.520575 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835zrc7r"
Apr 17 16:38:25.608699 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:25.608664 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/01a3d02a-4572-474f-9997-a0d856332153-bundle\") pod \"01a3d02a-4572-474f-9997-a0d856332153\" (UID: \"01a3d02a-4572-474f-9997-a0d856332153\") "
Apr 17 16:38:25.608892 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:25.608778 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-56c2n\" (UniqueName: \"kubernetes.io/projected/01a3d02a-4572-474f-9997-a0d856332153-kube-api-access-56c2n\") pod \"01a3d02a-4572-474f-9997-a0d856332153\" (UID: \"01a3d02a-4572-474f-9997-a0d856332153\") "
Apr 17 16:38:25.608892 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:25.608804 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/01a3d02a-4572-474f-9997-a0d856332153-util\") pod \"01a3d02a-4572-474f-9997-a0d856332153\" (UID: \"01a3d02a-4572-474f-9997-a0d856332153\") "
Apr 17 16:38:25.609625 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:25.609598 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01a3d02a-4572-474f-9997-a0d856332153-bundle" (OuterVolumeSpecName: "bundle") pod "01a3d02a-4572-474f-9997-a0d856332153" (UID: "01a3d02a-4572-474f-9997-a0d856332153"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 16:38:25.611024 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:25.611001 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01a3d02a-4572-474f-9997-a0d856332153-kube-api-access-56c2n" (OuterVolumeSpecName: "kube-api-access-56c2n") pod "01a3d02a-4572-474f-9997-a0d856332153" (UID: "01a3d02a-4572-474f-9997-a0d856332153"). InnerVolumeSpecName "kube-api-access-56c2n". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 16:38:25.614236 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:25.614213 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01a3d02a-4572-474f-9997-a0d856332153-util" (OuterVolumeSpecName: "util") pod "01a3d02a-4572-474f-9997-a0d856332153" (UID: "01a3d02a-4572-474f-9997-a0d856332153"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 16:38:25.709657 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:25.709579 2572 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/01a3d02a-4572-474f-9997-a0d856332153-util\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\""
Apr 17 16:38:25.709657 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:25.709608 2572 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/01a3d02a-4572-474f-9997-a0d856332153-bundle\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\""
Apr 17 16:38:25.709657 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:25.709618 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-56c2n\" (UniqueName: \"kubernetes.io/projected/01a3d02a-4572-474f-9997-a0d856332153-kube-api-access-56c2n\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\""
Apr 17 16:38:26.401697 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:26.401659 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835zrc7r" event={"ID":"01a3d02a-4572-474f-9997-a0d856332153","Type":"ContainerDied","Data":"a0c0add8dabb54e01e9a63fbeb09c90faa716a225bfd4d215dd9f97db8d1f536"}
Apr 17 16:38:26.401697 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:26.401701 2572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a0c0add8dabb54e01e9a63fbeb09c90faa716a225bfd4d215dd9f97db8d1f536"
Apr 17 16:38:26.401987 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:26.401675 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835zrc7r"
Apr 17 16:38:34.576244 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:34.576204 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-5f68f6fcb9-mtr9n"]
Apr 17 16:38:34.576611 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:34.576551 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="01a3d02a-4572-474f-9997-a0d856332153" containerName="util"
Apr 17 16:38:34.576611 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:34.576565 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="01a3d02a-4572-474f-9997-a0d856332153" containerName="util"
Apr 17 16:38:34.576611 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:34.576579 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="01a3d02a-4572-474f-9997-a0d856332153" containerName="extract"
Apr 17 16:38:34.576611 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:34.576588 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="01a3d02a-4572-474f-9997-a0d856332153" containerName="extract"
Apr 17 16:38:34.576766 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:34.576617 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="01a3d02a-4572-474f-9997-a0d856332153" containerName="pull"
Apr 17 16:38:34.576766 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:34.576622 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="01a3d02a-4572-474f-9997-a0d856332153" containerName="pull"
Apr 17 16:38:34.576766 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:34.576673 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="01a3d02a-4572-474f-9997-a0d856332153" containerName="extract"
Apr 17 16:38:34.580801 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:34.580784 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-5f68f6fcb9-mtr9n"
Apr 17 16:38:34.584621 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:34.584599 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\""
Apr 17 16:38:34.586507 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:34.586487 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-rdcg4\""
Apr 17 16:38:34.586632 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:34.586611 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\""
Apr 17 16:38:34.586836 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:34.586822 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\""
Apr 17 16:38:34.586965 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:34.586942 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\""
Apr 17 16:38:34.587075 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:34.586993 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\""
Apr 17 16:38:34.597189 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:34.597166 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-5f68f6fcb9-mtr9n"]
Apr 17 16:38:34.679711 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:34.679682 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/a8ddeb87-fbaf-4d48-84a3-44c8eaa95e63-manager-config\") pod \"lws-controller-manager-5f68f6fcb9-mtr9n\" (UID: \"a8ddeb87-fbaf-4d48-84a3-44c8eaa95e63\") " pod="openshift-lws-operator/lws-controller-manager-5f68f6fcb9-mtr9n"
Apr 17 16:38:34.679711 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:34.679711 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/a8ddeb87-fbaf-4d48-84a3-44c8eaa95e63-metrics-cert\") pod \"lws-controller-manager-5f68f6fcb9-mtr9n\" (UID: \"a8ddeb87-fbaf-4d48-84a3-44c8eaa95e63\") " pod="openshift-lws-operator/lws-controller-manager-5f68f6fcb9-mtr9n"
Apr 17 16:38:34.679950 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:34.679852 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbg6w\" (UniqueName: \"kubernetes.io/projected/a8ddeb87-fbaf-4d48-84a3-44c8eaa95e63-kube-api-access-vbg6w\") pod \"lws-controller-manager-5f68f6fcb9-mtr9n\" (UID: \"a8ddeb87-fbaf-4d48-84a3-44c8eaa95e63\") " pod="openshift-lws-operator/lws-controller-manager-5f68f6fcb9-mtr9n"
Apr 17 16:38:34.679950 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:34.679894 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a8ddeb87-fbaf-4d48-84a3-44c8eaa95e63-cert\") pod \"lws-controller-manager-5f68f6fcb9-mtr9n\" (UID: \"a8ddeb87-fbaf-4d48-84a3-44c8eaa95e63\") " pod="openshift-lws-operator/lws-controller-manager-5f68f6fcb9-mtr9n"
Apr 17 16:38:34.780619 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:34.780546 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/a8ddeb87-fbaf-4d48-84a3-44c8eaa95e63-manager-config\") pod \"lws-controller-manager-5f68f6fcb9-mtr9n\" (UID: \"a8ddeb87-fbaf-4d48-84a3-44c8eaa95e63\") " pod="openshift-lws-operator/lws-controller-manager-5f68f6fcb9-mtr9n"
Apr 17 16:38:34.780619 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:34.780582 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/a8ddeb87-fbaf-4d48-84a3-44c8eaa95e63-metrics-cert\") pod \"lws-controller-manager-5f68f6fcb9-mtr9n\" (UID: \"a8ddeb87-fbaf-4d48-84a3-44c8eaa95e63\") " pod="openshift-lws-operator/lws-controller-manager-5f68f6fcb9-mtr9n"
Apr 17 16:38:34.780837 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:34.780626 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vbg6w\" (UniqueName: \"kubernetes.io/projected/a8ddeb87-fbaf-4d48-84a3-44c8eaa95e63-kube-api-access-vbg6w\") pod \"lws-controller-manager-5f68f6fcb9-mtr9n\" (UID: \"a8ddeb87-fbaf-4d48-84a3-44c8eaa95e63\") " pod="openshift-lws-operator/lws-controller-manager-5f68f6fcb9-mtr9n"
Apr 17 16:38:34.780837 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:34.780652 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a8ddeb87-fbaf-4d48-84a3-44c8eaa95e63-cert\") pod \"lws-controller-manager-5f68f6fcb9-mtr9n\" (UID: \"a8ddeb87-fbaf-4d48-84a3-44c8eaa95e63\") " pod="openshift-lws-operator/lws-controller-manager-5f68f6fcb9-mtr9n"
Apr 17 16:38:34.781204 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:34.781182 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/a8ddeb87-fbaf-4d48-84a3-44c8eaa95e63-manager-config\") pod \"lws-controller-manager-5f68f6fcb9-mtr9n\" (UID: \"a8ddeb87-fbaf-4d48-84a3-44c8eaa95e63\") " pod="openshift-lws-operator/lws-controller-manager-5f68f6fcb9-mtr9n"
Apr 17 16:38:34.783177 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:34.783153 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/a8ddeb87-fbaf-4d48-84a3-44c8eaa95e63-metrics-cert\") pod \"lws-controller-manager-5f68f6fcb9-mtr9n\" (UID: \"a8ddeb87-fbaf-4d48-84a3-44c8eaa95e63\") " pod="openshift-lws-operator/lws-controller-manager-5f68f6fcb9-mtr9n"
Apr 17 16:38:34.783358 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:34.783337 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a8ddeb87-fbaf-4d48-84a3-44c8eaa95e63-cert\") pod \"lws-controller-manager-5f68f6fcb9-mtr9n\" (UID: \"a8ddeb87-fbaf-4d48-84a3-44c8eaa95e63\") " pod="openshift-lws-operator/lws-controller-manager-5f68f6fcb9-mtr9n"
Apr 17 16:38:34.799278 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:34.799254 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbg6w\" (UniqueName: \"kubernetes.io/projected/a8ddeb87-fbaf-4d48-84a3-44c8eaa95e63-kube-api-access-vbg6w\") pod \"lws-controller-manager-5f68f6fcb9-mtr9n\" (UID: \"a8ddeb87-fbaf-4d48-84a3-44c8eaa95e63\") " pod="openshift-lws-operator/lws-controller-manager-5f68f6fcb9-mtr9n"
Apr 17 16:38:34.889692 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:34.889661 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-5f68f6fcb9-mtr9n"
Apr 17 16:38:35.024365 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:35.024342 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-5f68f6fcb9-mtr9n"]
Apr 17 16:38:35.027108 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:38:35.027074 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8ddeb87_fbaf_4d48_84a3_44c8eaa95e63.slice/crio-bd32fb41710c96db5c8331cabcdb50ce7981a49172232c3c672f73184dae9259 WatchSource:0}: Error finding container bd32fb41710c96db5c8331cabcdb50ce7981a49172232c3c672f73184dae9259: Status 404 returned error can't find the container with id bd32fb41710c96db5c8331cabcdb50ce7981a49172232c3c672f73184dae9259
Apr 17 16:38:35.434559 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:35.434522 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-5f68f6fcb9-mtr9n" event={"ID":"a8ddeb87-fbaf-4d48-84a3-44c8eaa95e63","Type":"ContainerStarted","Data":"bd32fb41710c96db5c8331cabcdb50ce7981a49172232c3c672f73184dae9259"}
Apr 17 16:38:36.440284 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:36.440238 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-5f68f6fcb9-mtr9n" event={"ID":"a8ddeb87-fbaf-4d48-84a3-44c8eaa95e63","Type":"ContainerStarted","Data":"647253cc6a25eedc00fd3c43f54b79266140eb35bd5d29668b146583139d2758"}
Apr 17 16:38:36.440705 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:36.440374 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-5f68f6fcb9-mtr9n"
Apr 17 16:38:36.458667 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:36.458612 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-5f68f6fcb9-mtr9n" podStartSLOduration=1.27581642 podStartE2EDuration="2.458598609s" podCreationTimestamp="2026-04-17 16:38:34 +0000 UTC" firstStartedPulling="2026-04-17 16:38:35.030031768 +0000 UTC m=+437.617620731" lastFinishedPulling="2026-04-17 16:38:36.212813968 +0000 UTC m=+438.800402920" observedRunningTime="2026-04-17 16:38:36.457066038 +0000 UTC m=+439.044655039" watchObservedRunningTime="2026-04-17 16:38:36.458598609 +0000 UTC m=+439.046187579"
Apr 17 16:38:38.335389 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:38.335354 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2sz5pr"]
Apr 17 16:38:38.338934 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:38.338917 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2sz5pr"
Apr 17 16:38:38.341427 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:38.341405 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 17 16:38:38.342141 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:38.342127 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 17 16:38:38.342196 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:38.342137 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-q9kms\""
Apr 17 16:38:38.348515 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:38.348495 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2sz5pr"]
Apr 17 16:38:38.405468 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:38.405434 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2b09a71f-258c-42e7-8eab-363c62580cb5-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2sz5pr\" (UID: \"2b09a71f-258c-42e7-8eab-363c62580cb5\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2sz5pr"
Apr 17 16:38:38.405468 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:38.405469 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4fs8\" (UniqueName: \"kubernetes.io/projected/2b09a71f-258c-42e7-8eab-363c62580cb5-kube-api-access-q4fs8\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2sz5pr\" (UID: \"2b09a71f-258c-42e7-8eab-363c62580cb5\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2sz5pr"
Apr 17 16:38:38.405627 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:38.405492 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2b09a71f-258c-42e7-8eab-363c62580cb5-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2sz5pr\" (UID: \"2b09a71f-258c-42e7-8eab-363c62580cb5\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2sz5pr"
Apr 17 16:38:38.506407 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:38.506379 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2b09a71f-258c-42e7-8eab-363c62580cb5-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2sz5pr\" (UID: \"2b09a71f-258c-42e7-8eab-363c62580cb5\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2sz5pr"
Apr 17 16:38:38.506515 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:38.506412 2572 reconciler_common.go:224]
"operationExecutor.MountVolume started for volume \"kube-api-access-q4fs8\" (UniqueName: \"kubernetes.io/projected/2b09a71f-258c-42e7-8eab-363c62580cb5-kube-api-access-q4fs8\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2sz5pr\" (UID: \"2b09a71f-258c-42e7-8eab-363c62580cb5\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2sz5pr" Apr 17 16:38:38.506515 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:38.506436 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2b09a71f-258c-42e7-8eab-363c62580cb5-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2sz5pr\" (UID: \"2b09a71f-258c-42e7-8eab-363c62580cb5\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2sz5pr" Apr 17 16:38:38.506795 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:38.506778 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2b09a71f-258c-42e7-8eab-363c62580cb5-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2sz5pr\" (UID: \"2b09a71f-258c-42e7-8eab-363c62580cb5\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2sz5pr" Apr 17 16:38:38.506841 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:38.506814 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2b09a71f-258c-42e7-8eab-363c62580cb5-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2sz5pr\" (UID: \"2b09a71f-258c-42e7-8eab-363c62580cb5\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2sz5pr" Apr 17 16:38:38.517456 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:38.517433 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4fs8\" (UniqueName: 
\"kubernetes.io/projected/2b09a71f-258c-42e7-8eab-363c62580cb5-kube-api-access-q4fs8\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2sz5pr\" (UID: \"2b09a71f-258c-42e7-8eab-363c62580cb5\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2sz5pr" Apr 17 16:38:38.649120 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:38.649082 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2sz5pr" Apr 17 16:38:38.776110 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:38.776084 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2sz5pr"] Apr 17 16:38:38.778154 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:38:38.778122 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b09a71f_258c_42e7_8eab_363c62580cb5.slice/crio-a2006203e99d602bf6d748f80444c950a951d09e84d19460bfc0c3a19f09c3ff WatchSource:0}: Error finding container a2006203e99d602bf6d748f80444c950a951d09e84d19460bfc0c3a19f09c3ff: Status 404 returned error can't find the container with id a2006203e99d602bf6d748f80444c950a951d09e84d19460bfc0c3a19f09c3ff Apr 17 16:38:39.451171 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:39.451139 2572 generic.go:358] "Generic (PLEG): container finished" podID="2b09a71f-258c-42e7-8eab-363c62580cb5" containerID="0f674073a5fc5015585c3d4f922a67c503bdebcdfd5c5df7be26dd9ef4c48917" exitCode=0 Apr 17 16:38:39.451547 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:39.451201 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2sz5pr" event={"ID":"2b09a71f-258c-42e7-8eab-363c62580cb5","Type":"ContainerDied","Data":"0f674073a5fc5015585c3d4f922a67c503bdebcdfd5c5df7be26dd9ef4c48917"} Apr 17 
16:38:39.451547 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:39.451222 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2sz5pr" event={"ID":"2b09a71f-258c-42e7-8eab-363c62580cb5","Type":"ContainerStarted","Data":"a2006203e99d602bf6d748f80444c950a951d09e84d19460bfc0c3a19f09c3ff"} Apr 17 16:38:40.456243 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:40.456200 2572 generic.go:358] "Generic (PLEG): container finished" podID="2b09a71f-258c-42e7-8eab-363c62580cb5" containerID="2872a9053674630a234f4b3e4bf58689a27865763af31f60457ccbd541406a40" exitCode=0 Apr 17 16:38:40.456609 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:40.456288 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2sz5pr" event={"ID":"2b09a71f-258c-42e7-8eab-363c62580cb5","Type":"ContainerDied","Data":"2872a9053674630a234f4b3e4bf58689a27865763af31f60457ccbd541406a40"} Apr 17 16:38:41.461995 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:41.461953 2572 generic.go:358] "Generic (PLEG): container finished" podID="2b09a71f-258c-42e7-8eab-363c62580cb5" containerID="1477b91cc6f9d7480fe376af5346befa91fe0db084220624dc7fc2b3abca7cc9" exitCode=0 Apr 17 16:38:41.462387 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:41.462034 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2sz5pr" event={"ID":"2b09a71f-258c-42e7-8eab-363c62580cb5","Type":"ContainerDied","Data":"1477b91cc6f9d7480fe376af5346befa91fe0db084220624dc7fc2b3abca7cc9"} Apr 17 16:38:42.588982 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:42.588959 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2sz5pr" Apr 17 16:38:42.633148 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:42.633121 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2b09a71f-258c-42e7-8eab-363c62580cb5-bundle\") pod \"2b09a71f-258c-42e7-8eab-363c62580cb5\" (UID: \"2b09a71f-258c-42e7-8eab-363c62580cb5\") " Apr 17 16:38:42.633286 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:42.633209 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2b09a71f-258c-42e7-8eab-363c62580cb5-util\") pod \"2b09a71f-258c-42e7-8eab-363c62580cb5\" (UID: \"2b09a71f-258c-42e7-8eab-363c62580cb5\") " Apr 17 16:38:42.633286 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:42.633244 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q4fs8\" (UniqueName: \"kubernetes.io/projected/2b09a71f-258c-42e7-8eab-363c62580cb5-kube-api-access-q4fs8\") pod \"2b09a71f-258c-42e7-8eab-363c62580cb5\" (UID: \"2b09a71f-258c-42e7-8eab-363c62580cb5\") " Apr 17 16:38:42.634052 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:42.634004 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b09a71f-258c-42e7-8eab-363c62580cb5-bundle" (OuterVolumeSpecName: "bundle") pod "2b09a71f-258c-42e7-8eab-363c62580cb5" (UID: "2b09a71f-258c-42e7-8eab-363c62580cb5"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:38:42.635484 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:42.635456 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b09a71f-258c-42e7-8eab-363c62580cb5-kube-api-access-q4fs8" (OuterVolumeSpecName: "kube-api-access-q4fs8") pod "2b09a71f-258c-42e7-8eab-363c62580cb5" (UID: "2b09a71f-258c-42e7-8eab-363c62580cb5"). InnerVolumeSpecName "kube-api-access-q4fs8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:38:42.638625 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:42.638606 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b09a71f-258c-42e7-8eab-363c62580cb5-util" (OuterVolumeSpecName: "util") pod "2b09a71f-258c-42e7-8eab-363c62580cb5" (UID: "2b09a71f-258c-42e7-8eab-363c62580cb5"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:38:42.734529 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:42.734458 2572 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2b09a71f-258c-42e7-8eab-363c62580cb5-bundle\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\"" Apr 17 16:38:42.734529 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:42.734484 2572 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2b09a71f-258c-42e7-8eab-363c62580cb5-util\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\"" Apr 17 16:38:42.734529 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:42.734495 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-q4fs8\" (UniqueName: \"kubernetes.io/projected/2b09a71f-258c-42e7-8eab-363c62580cb5-kube-api-access-q4fs8\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\"" Apr 17 16:38:43.470735 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:43.470687 2572 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2sz5pr" event={"ID":"2b09a71f-258c-42e7-8eab-363c62580cb5","Type":"ContainerDied","Data":"a2006203e99d602bf6d748f80444c950a951d09e84d19460bfc0c3a19f09c3ff"} Apr 17 16:38:43.470910 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:43.470762 2572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a2006203e99d602bf6d748f80444c950a951d09e84d19460bfc0c3a19f09c3ff" Apr 17 16:38:43.470910 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:43.470712 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2sz5pr" Apr 17 16:38:47.445355 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:38:47.445326 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-5f68f6fcb9-mtr9n" Apr 17 16:39:18.190702 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:18.190667 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bgnph2"] Apr 17 16:39:18.191177 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:18.191014 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2b09a71f-258c-42e7-8eab-363c62580cb5" containerName="pull" Apr 17 16:39:18.191177 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:18.191026 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b09a71f-258c-42e7-8eab-363c62580cb5" containerName="pull" Apr 17 16:39:18.191177 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:18.191038 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2b09a71f-258c-42e7-8eab-363c62580cb5" containerName="util" Apr 17 16:39:18.191177 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:18.191044 2572 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="2b09a71f-258c-42e7-8eab-363c62580cb5" containerName="util" Apr 17 16:39:18.191177 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:18.191060 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2b09a71f-258c-42e7-8eab-363c62580cb5" containerName="extract" Apr 17 16:39:18.191177 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:18.191065 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b09a71f-258c-42e7-8eab-363c62580cb5" containerName="extract" Apr 17 16:39:18.191177 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:18.191116 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="2b09a71f-258c-42e7-8eab-363c62580cb5" containerName="extract" Apr 17 16:39:18.196944 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:18.196928 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bgnph2" Apr 17 16:39:18.199456 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:18.199433 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 17 16:39:18.200147 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:18.200121 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-q9kms\"" Apr 17 16:39:18.200312 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:18.200295 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 17 16:39:18.201948 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:18.201923 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bgnph2"] Apr 17 16:39:18.295248 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:18.295214 2572 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30j5j8s"] Apr 17 16:39:18.299643 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:18.299623 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30j5j8s" Apr 17 16:39:18.307969 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:18.307949 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30j5j8s"] Apr 17 16:39:18.330454 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:18.330429 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tvmp\" (UniqueName: \"kubernetes.io/projected/1f33dd23-b02a-476a-adda-e5395e1fff15-kube-api-access-8tvmp\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bgnph2\" (UID: \"1f33dd23-b02a-476a-adda-e5395e1fff15\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bgnph2" Apr 17 16:39:18.330557 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:18.330472 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1f33dd23-b02a-476a-adda-e5395e1fff15-util\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bgnph2\" (UID: \"1f33dd23-b02a-476a-adda-e5395e1fff15\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bgnph2" Apr 17 16:39:18.330603 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:18.330560 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1f33dd23-b02a-476a-adda-e5395e1fff15-bundle\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bgnph2\" (UID: \"1f33dd23-b02a-476a-adda-e5395e1fff15\") " 
pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bgnph2" Apr 17 16:39:18.398041 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:18.398012 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88dh6hg"] Apr 17 16:39:18.402210 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:18.402194 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88dh6hg" Apr 17 16:39:18.409028 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:18.409005 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88dh6hg"] Apr 17 16:39:18.431139 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:18.431118 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8tvmp\" (UniqueName: \"kubernetes.io/projected/1f33dd23-b02a-476a-adda-e5395e1fff15-kube-api-access-8tvmp\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bgnph2\" (UID: \"1f33dd23-b02a-476a-adda-e5395e1fff15\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bgnph2" Apr 17 16:39:18.431243 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:18.431156 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1f33dd23-b02a-476a-adda-e5395e1fff15-util\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bgnph2\" (UID: \"1f33dd23-b02a-476a-adda-e5395e1fff15\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bgnph2" Apr 17 16:39:18.431243 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:18.431207 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/1f33dd23-b02a-476a-adda-e5395e1fff15-bundle\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bgnph2\" (UID: \"1f33dd23-b02a-476a-adda-e5395e1fff15\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bgnph2" Apr 17 16:39:18.431243 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:18.431235 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skfgl\" (UniqueName: \"kubernetes.io/projected/c84364d9-6aa2-46c0-be51-142fda9d8c6b-kube-api-access-skfgl\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30j5j8s\" (UID: \"c84364d9-6aa2-46c0-be51-142fda9d8c6b\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30j5j8s" Apr 17 16:39:18.431373 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:18.431254 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c84364d9-6aa2-46c0-be51-142fda9d8c6b-util\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30j5j8s\" (UID: \"c84364d9-6aa2-46c0-be51-142fda9d8c6b\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30j5j8s" Apr 17 16:39:18.431373 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:18.431292 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c84364d9-6aa2-46c0-be51-142fda9d8c6b-bundle\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30j5j8s\" (UID: \"c84364d9-6aa2-46c0-be51-142fda9d8c6b\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30j5j8s" Apr 17 16:39:18.431541 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:18.431525 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/1f33dd23-b02a-476a-adda-e5395e1fff15-bundle\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bgnph2\" (UID: \"1f33dd23-b02a-476a-adda-e5395e1fff15\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bgnph2" Apr 17 16:39:18.431585 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:18.431571 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1f33dd23-b02a-476a-adda-e5395e1fff15-util\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bgnph2\" (UID: \"1f33dd23-b02a-476a-adda-e5395e1fff15\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bgnph2" Apr 17 16:39:18.440409 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:18.440388 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tvmp\" (UniqueName: \"kubernetes.io/projected/1f33dd23-b02a-476a-adda-e5395e1fff15-kube-api-access-8tvmp\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bgnph2\" (UID: \"1f33dd23-b02a-476a-adda-e5395e1fff15\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bgnph2" Apr 17 16:39:18.498881 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:18.498827 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503qhv2d"] Apr 17 16:39:18.503251 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:18.503237 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503qhv2d" Apr 17 16:39:18.506861 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:18.506839 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bgnph2" Apr 17 16:39:18.513959 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:18.513929 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503qhv2d"] Apr 17 16:39:18.532100 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:18.532075 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spb6v\" (UniqueName: \"kubernetes.io/projected/c97e7982-9471-4e35-8802-3d643a4d14a5-kube-api-access-spb6v\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88dh6hg\" (UID: \"c97e7982-9471-4e35-8802-3d643a4d14a5\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88dh6hg" Apr 17 16:39:18.532212 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:18.532115 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-skfgl\" (UniqueName: \"kubernetes.io/projected/c84364d9-6aa2-46c0-be51-142fda9d8c6b-kube-api-access-skfgl\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30j5j8s\" (UID: \"c84364d9-6aa2-46c0-be51-142fda9d8c6b\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30j5j8s" Apr 17 16:39:18.532212 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:18.532173 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c84364d9-6aa2-46c0-be51-142fda9d8c6b-util\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30j5j8s\" (UID: \"c84364d9-6aa2-46c0-be51-142fda9d8c6b\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30j5j8s" Apr 17 16:39:18.532328 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:18.532237 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c84364d9-6aa2-46c0-be51-142fda9d8c6b-bundle\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30j5j8s\" (UID: \"c84364d9-6aa2-46c0-be51-142fda9d8c6b\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30j5j8s" Apr 17 16:39:18.532328 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:18.532291 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c97e7982-9471-4e35-8802-3d643a4d14a5-util\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88dh6hg\" (UID: \"c97e7982-9471-4e35-8802-3d643a4d14a5\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88dh6hg" Apr 17 16:39:18.532444 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:18.532384 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c97e7982-9471-4e35-8802-3d643a4d14a5-bundle\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88dh6hg\" (UID: \"c97e7982-9471-4e35-8802-3d643a4d14a5\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88dh6hg" Apr 17 16:39:18.532645 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:18.532625 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c84364d9-6aa2-46c0-be51-142fda9d8c6b-util\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30j5j8s\" (UID: \"c84364d9-6aa2-46c0-be51-142fda9d8c6b\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30j5j8s" Apr 17 16:39:18.532745 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:18.532655 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/c84364d9-6aa2-46c0-be51-142fda9d8c6b-bundle\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30j5j8s\" (UID: \"c84364d9-6aa2-46c0-be51-142fda9d8c6b\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30j5j8s" Apr 17 16:39:18.541197 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:18.541175 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-skfgl\" (UniqueName: \"kubernetes.io/projected/c84364d9-6aa2-46c0-be51-142fda9d8c6b-kube-api-access-skfgl\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30j5j8s\" (UID: \"c84364d9-6aa2-46c0-be51-142fda9d8c6b\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30j5j8s" Apr 17 16:39:18.608791 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:18.608764 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30j5j8s" Apr 17 16:39:18.633758 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:18.633643 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c97e7982-9471-4e35-8802-3d643a4d14a5-util\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88dh6hg\" (UID: \"c97e7982-9471-4e35-8802-3d643a4d14a5\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88dh6hg" Apr 17 16:39:18.633758 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:18.633685 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbt56\" (UniqueName: \"kubernetes.io/projected/2419ec7b-d89b-499a-9e49-6eb1027fb271-kube-api-access-hbt56\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503qhv2d\" (UID: \"2419ec7b-d89b-499a-9e49-6eb1027fb271\") " 
pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503qhv2d" Apr 17 16:39:18.633901 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:18.633790 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c97e7982-9471-4e35-8802-3d643a4d14a5-bundle\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88dh6hg\" (UID: \"c97e7982-9471-4e35-8802-3d643a4d14a5\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88dh6hg" Apr 17 16:39:18.633901 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:18.633837 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2419ec7b-d89b-499a-9e49-6eb1027fb271-util\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503qhv2d\" (UID: \"2419ec7b-d89b-499a-9e49-6eb1027fb271\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503qhv2d" Apr 17 16:39:18.633901 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:18.633878 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-spb6v\" (UniqueName: \"kubernetes.io/projected/c97e7982-9471-4e35-8802-3d643a4d14a5-kube-api-access-spb6v\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88dh6hg\" (UID: \"c97e7982-9471-4e35-8802-3d643a4d14a5\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88dh6hg" Apr 17 16:39:18.634114 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:18.633917 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2419ec7b-d89b-499a-9e49-6eb1027fb271-bundle\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503qhv2d\" (UID: \"2419ec7b-d89b-499a-9e49-6eb1027fb271\") " 
pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503qhv2d" Apr 17 16:39:18.634172 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:18.634146 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c97e7982-9471-4e35-8802-3d643a4d14a5-util\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88dh6hg\" (UID: \"c97e7982-9471-4e35-8802-3d643a4d14a5\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88dh6hg" Apr 17 16:39:18.634228 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:18.634168 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c97e7982-9471-4e35-8802-3d643a4d14a5-bundle\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88dh6hg\" (UID: \"c97e7982-9471-4e35-8802-3d643a4d14a5\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88dh6hg" Apr 17 16:39:18.634430 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:18.634409 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bgnph2"] Apr 17 16:39:18.636137 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:39:18.636117 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f33dd23_b02a_476a_adda_e5395e1fff15.slice/crio-a428af485593e32fcda7f3fb6f33ddd1fd366e9e696cf60c1fbc27ff33c1a386 WatchSource:0}: Error finding container a428af485593e32fcda7f3fb6f33ddd1fd366e9e696cf60c1fbc27ff33c1a386: Status 404 returned error can't find the container with id a428af485593e32fcda7f3fb6f33ddd1fd366e9e696cf60c1fbc27ff33c1a386 Apr 17 16:39:18.642680 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:18.642662 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-spb6v\" 
(UniqueName: \"kubernetes.io/projected/c97e7982-9471-4e35-8802-3d643a4d14a5-kube-api-access-spb6v\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88dh6hg\" (UID: \"c97e7982-9471-4e35-8802-3d643a4d14a5\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88dh6hg" Apr 17 16:39:18.717947 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:18.717915 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88dh6hg" Apr 17 16:39:18.734453 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:18.734425 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hbt56\" (UniqueName: \"kubernetes.io/projected/2419ec7b-d89b-499a-9e49-6eb1027fb271-kube-api-access-hbt56\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503qhv2d\" (UID: \"2419ec7b-d89b-499a-9e49-6eb1027fb271\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503qhv2d" Apr 17 16:39:18.735118 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:18.734549 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2419ec7b-d89b-499a-9e49-6eb1027fb271-util\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503qhv2d\" (UID: \"2419ec7b-d89b-499a-9e49-6eb1027fb271\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503qhv2d" Apr 17 16:39:18.735118 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:18.734602 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2419ec7b-d89b-499a-9e49-6eb1027fb271-bundle\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503qhv2d\" (UID: \"2419ec7b-d89b-499a-9e49-6eb1027fb271\") " 
pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503qhv2d" Apr 17 16:39:18.735118 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:18.734974 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2419ec7b-d89b-499a-9e49-6eb1027fb271-bundle\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503qhv2d\" (UID: \"2419ec7b-d89b-499a-9e49-6eb1027fb271\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503qhv2d" Apr 17 16:39:18.735118 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:18.735098 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2419ec7b-d89b-499a-9e49-6eb1027fb271-util\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503qhv2d\" (UID: \"2419ec7b-d89b-499a-9e49-6eb1027fb271\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503qhv2d" Apr 17 16:39:18.738528 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:18.738505 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30j5j8s"] Apr 17 16:39:18.741349 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:39:18.741308 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc84364d9_6aa2_46c0_be51_142fda9d8c6b.slice/crio-85a2df2b6995e62ebf7a306595b9aef23ccfdcf6750ab8c41c22b2dbf32d0379 WatchSource:0}: Error finding container 85a2df2b6995e62ebf7a306595b9aef23ccfdcf6750ab8c41c22b2dbf32d0379: Status 404 returned error can't find the container with id 85a2df2b6995e62ebf7a306595b9aef23ccfdcf6750ab8c41c22b2dbf32d0379 Apr 17 16:39:18.743117 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:18.743093 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbt56\" 
(UniqueName: \"kubernetes.io/projected/2419ec7b-d89b-499a-9e49-6eb1027fb271-kube-api-access-hbt56\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503qhv2d\" (UID: \"2419ec7b-d89b-499a-9e49-6eb1027fb271\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503qhv2d" Apr 17 16:39:18.814015 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:18.813200 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503qhv2d" Apr 17 16:39:18.852458 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:18.852431 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88dh6hg"] Apr 17 16:39:18.854595 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:39:18.854460 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc97e7982_9471_4e35_8802_3d643a4d14a5.slice/crio-eafcf3fbda038447ad0f32a0161ffdd36c320923e30a418847b8ee5751bd1f83 WatchSource:0}: Error finding container eafcf3fbda038447ad0f32a0161ffdd36c320923e30a418847b8ee5751bd1f83: Status 404 returned error can't find the container with id eafcf3fbda038447ad0f32a0161ffdd36c320923e30a418847b8ee5751bd1f83 Apr 17 16:39:18.944577 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:18.944551 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503qhv2d"] Apr 17 16:39:18.956209 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:39:18.956183 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2419ec7b_d89b_499a_9e49_6eb1027fb271.slice/crio-695f708d478b91351c7771dc3b1f1b982e96e79fed625123c96677172adec8d3 WatchSource:0}: Error finding container 
695f708d478b91351c7771dc3b1f1b982e96e79fed625123c96677172adec8d3: Status 404 returned error can't find the container with id 695f708d478b91351c7771dc3b1f1b982e96e79fed625123c96677172adec8d3 Apr 17 16:39:19.103981 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:39:19.103954 2572 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2419ec7b_d89b_499a_9e49_6eb1027fb271.slice/crio-e0c36773c392fb34bce122154cc4fd631a8a66e15c0f8e29299e48e452b56723.scope\": RecentStats: unable to find data in memory cache]" Apr 17 16:39:19.597832 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:19.597797 2572 generic.go:358] "Generic (PLEG): container finished" podID="c84364d9-6aa2-46c0-be51-142fda9d8c6b" containerID="3aa92489cce4615e8ea8f9aea15b2d45804c8e4d38aacd1a5be5b0149ee33ec1" exitCode=0 Apr 17 16:39:19.598332 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:19.597869 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30j5j8s" event={"ID":"c84364d9-6aa2-46c0-be51-142fda9d8c6b","Type":"ContainerDied","Data":"3aa92489cce4615e8ea8f9aea15b2d45804c8e4d38aacd1a5be5b0149ee33ec1"} Apr 17 16:39:19.598332 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:19.597890 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30j5j8s" event={"ID":"c84364d9-6aa2-46c0-be51-142fda9d8c6b","Type":"ContainerStarted","Data":"85a2df2b6995e62ebf7a306595b9aef23ccfdcf6750ab8c41c22b2dbf32d0379"} Apr 17 16:39:19.599364 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:19.599341 2572 generic.go:358] "Generic (PLEG): container finished" podID="1f33dd23-b02a-476a-adda-e5395e1fff15" containerID="8705fde5697a74ef67d11d5e8fe78d809b232151af687fdd2c960cfb5212e62e" exitCode=0 Apr 17 16:39:19.599456 ip-10-0-131-177 kubenswrapper[2572]: I0417 
16:39:19.599420 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bgnph2" event={"ID":"1f33dd23-b02a-476a-adda-e5395e1fff15","Type":"ContainerDied","Data":"8705fde5697a74ef67d11d5e8fe78d809b232151af687fdd2c960cfb5212e62e"} Apr 17 16:39:19.599525 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:19.599463 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bgnph2" event={"ID":"1f33dd23-b02a-476a-adda-e5395e1fff15","Type":"ContainerStarted","Data":"a428af485593e32fcda7f3fb6f33ddd1fd366e9e696cf60c1fbc27ff33c1a386"} Apr 17 16:39:19.600888 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:19.600799 2572 generic.go:358] "Generic (PLEG): container finished" podID="c97e7982-9471-4e35-8802-3d643a4d14a5" containerID="2c2493f631ae73748890580f1a3159a7f493802bc8d0c4bc95a41e73831d3c06" exitCode=0 Apr 17 16:39:19.600888 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:19.600822 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88dh6hg" event={"ID":"c97e7982-9471-4e35-8802-3d643a4d14a5","Type":"ContainerDied","Data":"2c2493f631ae73748890580f1a3159a7f493802bc8d0c4bc95a41e73831d3c06"} Apr 17 16:39:19.600888 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:19.600864 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88dh6hg" event={"ID":"c97e7982-9471-4e35-8802-3d643a4d14a5","Type":"ContainerStarted","Data":"eafcf3fbda038447ad0f32a0161ffdd36c320923e30a418847b8ee5751bd1f83"} Apr 17 16:39:19.602510 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:19.602493 2572 generic.go:358] "Generic (PLEG): container finished" podID="2419ec7b-d89b-499a-9e49-6eb1027fb271" containerID="e0c36773c392fb34bce122154cc4fd631a8a66e15c0f8e29299e48e452b56723" 
exitCode=0 Apr 17 16:39:19.602583 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:19.602558 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503qhv2d" event={"ID":"2419ec7b-d89b-499a-9e49-6eb1027fb271","Type":"ContainerDied","Data":"e0c36773c392fb34bce122154cc4fd631a8a66e15c0f8e29299e48e452b56723"} Apr 17 16:39:19.602583 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:19.602578 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503qhv2d" event={"ID":"2419ec7b-d89b-499a-9e49-6eb1027fb271","Type":"ContainerStarted","Data":"695f708d478b91351c7771dc3b1f1b982e96e79fed625123c96677172adec8d3"} Apr 17 16:39:20.608541 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:20.608507 2572 generic.go:358] "Generic (PLEG): container finished" podID="2419ec7b-d89b-499a-9e49-6eb1027fb271" containerID="eb541496735e1db1f31207a7083768a6308f47400db606aa4af17bd646b63130" exitCode=0 Apr 17 16:39:20.608951 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:20.608597 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503qhv2d" event={"ID":"2419ec7b-d89b-499a-9e49-6eb1027fb271","Type":"ContainerDied","Data":"eb541496735e1db1f31207a7083768a6308f47400db606aa4af17bd646b63130"} Apr 17 16:39:20.610421 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:20.610397 2572 generic.go:358] "Generic (PLEG): container finished" podID="c84364d9-6aa2-46c0-be51-142fda9d8c6b" containerID="7f4ddb55ea2852830dacde07482463ae533caa2e3e6568bb9fe44016f78c71e9" exitCode=0 Apr 17 16:39:20.610515 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:20.610427 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30j5j8s" 
event={"ID":"c84364d9-6aa2-46c0-be51-142fda9d8c6b","Type":"ContainerDied","Data":"7f4ddb55ea2852830dacde07482463ae533caa2e3e6568bb9fe44016f78c71e9"} Apr 17 16:39:20.612252 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:20.612232 2572 generic.go:358] "Generic (PLEG): container finished" podID="1f33dd23-b02a-476a-adda-e5395e1fff15" containerID="fdb0b904de6932e6c84b65c61d234ba6e520f2a3ffecd82b3234dafc350b8011" exitCode=0 Apr 17 16:39:20.612352 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:20.612286 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bgnph2" event={"ID":"1f33dd23-b02a-476a-adda-e5395e1fff15","Type":"ContainerDied","Data":"fdb0b904de6932e6c84b65c61d234ba6e520f2a3ffecd82b3234dafc350b8011"} Apr 17 16:39:21.617903 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:21.617873 2572 generic.go:358] "Generic (PLEG): container finished" podID="1f33dd23-b02a-476a-adda-e5395e1fff15" containerID="f3a9e370d015501ba93f76a8bae7cce7d58a6c46f5598ae4512e58d7a59ec574" exitCode=0 Apr 17 16:39:21.618308 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:21.617959 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bgnph2" event={"ID":"1f33dd23-b02a-476a-adda-e5395e1fff15","Type":"ContainerDied","Data":"f3a9e370d015501ba93f76a8bae7cce7d58a6c46f5598ae4512e58d7a59ec574"} Apr 17 16:39:21.619477 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:21.619455 2572 generic.go:358] "Generic (PLEG): container finished" podID="c97e7982-9471-4e35-8802-3d643a4d14a5" containerID="05f52341b52e7d501ba7b79c2d553921e54a7788058bf44927b4d90b7d68f976" exitCode=0 Apr 17 16:39:21.619586 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:21.619533 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88dh6hg" 
event={"ID":"c97e7982-9471-4e35-8802-3d643a4d14a5","Type":"ContainerDied","Data":"05f52341b52e7d501ba7b79c2d553921e54a7788058bf44927b4d90b7d68f976"} Apr 17 16:39:21.621639 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:21.621614 2572 generic.go:358] "Generic (PLEG): container finished" podID="2419ec7b-d89b-499a-9e49-6eb1027fb271" containerID="38c7a3d5ac404c3620aea87359097871178babfa026a2d9763b7d10ac72da858" exitCode=0 Apr 17 16:39:21.621765 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:21.621652 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503qhv2d" event={"ID":"2419ec7b-d89b-499a-9e49-6eb1027fb271","Type":"ContainerDied","Data":"38c7a3d5ac404c3620aea87359097871178babfa026a2d9763b7d10ac72da858"} Apr 17 16:39:21.623637 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:21.623618 2572 generic.go:358] "Generic (PLEG): container finished" podID="c84364d9-6aa2-46c0-be51-142fda9d8c6b" containerID="4458935a6abdce07c7aacd7d4935be0cf65e4a6099e8e05366d734498d67cc3c" exitCode=0 Apr 17 16:39:21.623773 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:21.623652 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30j5j8s" event={"ID":"c84364d9-6aa2-46c0-be51-142fda9d8c6b","Type":"ContainerDied","Data":"4458935a6abdce07c7aacd7d4935be0cf65e4a6099e8e05366d734498d67cc3c"} Apr 17 16:39:22.629074 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:22.629037 2572 generic.go:358] "Generic (PLEG): container finished" podID="c97e7982-9471-4e35-8802-3d643a4d14a5" containerID="3413f700c60669056f3e2b8f87250e82cf567c06f533fde3d1864c9eebeda49d" exitCode=0 Apr 17 16:39:22.629437 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:22.629135 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88dh6hg" 
event={"ID":"c97e7982-9471-4e35-8802-3d643a4d14a5","Type":"ContainerDied","Data":"3413f700c60669056f3e2b8f87250e82cf567c06f533fde3d1864c9eebeda49d"} Apr 17 16:39:22.768215 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:22.768193 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503qhv2d" Apr 17 16:39:22.802093 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:22.802063 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bgnph2" Apr 17 16:39:22.805676 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:22.805655 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30j5j8s" Apr 17 16:39:22.872424 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:22.872396 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hbt56\" (UniqueName: \"kubernetes.io/projected/2419ec7b-d89b-499a-9e49-6eb1027fb271-kube-api-access-hbt56\") pod \"2419ec7b-d89b-499a-9e49-6eb1027fb271\" (UID: \"2419ec7b-d89b-499a-9e49-6eb1027fb271\") " Apr 17 16:39:22.872568 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:22.872432 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2419ec7b-d89b-499a-9e49-6eb1027fb271-bundle\") pod \"2419ec7b-d89b-499a-9e49-6eb1027fb271\" (UID: \"2419ec7b-d89b-499a-9e49-6eb1027fb271\") " Apr 17 16:39:22.872568 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:22.872447 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2419ec7b-d89b-499a-9e49-6eb1027fb271-util\") pod \"2419ec7b-d89b-499a-9e49-6eb1027fb271\" (UID: 
\"2419ec7b-d89b-499a-9e49-6eb1027fb271\") " Apr 17 16:39:22.873040 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:22.873011 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2419ec7b-d89b-499a-9e49-6eb1027fb271-bundle" (OuterVolumeSpecName: "bundle") pod "2419ec7b-d89b-499a-9e49-6eb1027fb271" (UID: "2419ec7b-d89b-499a-9e49-6eb1027fb271"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:39:22.874588 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:22.874553 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2419ec7b-d89b-499a-9e49-6eb1027fb271-kube-api-access-hbt56" (OuterVolumeSpecName: "kube-api-access-hbt56") pod "2419ec7b-d89b-499a-9e49-6eb1027fb271" (UID: "2419ec7b-d89b-499a-9e49-6eb1027fb271"). InnerVolumeSpecName "kube-api-access-hbt56". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:39:22.877553 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:22.877516 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2419ec7b-d89b-499a-9e49-6eb1027fb271-util" (OuterVolumeSpecName: "util") pod "2419ec7b-d89b-499a-9e49-6eb1027fb271" (UID: "2419ec7b-d89b-499a-9e49-6eb1027fb271"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:39:22.972969 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:22.972940 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1f33dd23-b02a-476a-adda-e5395e1fff15-bundle\") pod \"1f33dd23-b02a-476a-adda-e5395e1fff15\" (UID: \"1f33dd23-b02a-476a-adda-e5395e1fff15\") " Apr 17 16:39:22.972969 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:22.972975 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c84364d9-6aa2-46c0-be51-142fda9d8c6b-util\") pod \"c84364d9-6aa2-46c0-be51-142fda9d8c6b\" (UID: \"c84364d9-6aa2-46c0-be51-142fda9d8c6b\") " Apr 17 16:39:22.973154 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:22.973002 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1f33dd23-b02a-476a-adda-e5395e1fff15-util\") pod \"1f33dd23-b02a-476a-adda-e5395e1fff15\" (UID: \"1f33dd23-b02a-476a-adda-e5395e1fff15\") " Apr 17 16:39:22.973154 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:22.973027 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tvmp\" (UniqueName: \"kubernetes.io/projected/1f33dd23-b02a-476a-adda-e5395e1fff15-kube-api-access-8tvmp\") pod \"1f33dd23-b02a-476a-adda-e5395e1fff15\" (UID: \"1f33dd23-b02a-476a-adda-e5395e1fff15\") " Apr 17 16:39:22.973154 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:22.973042 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c84364d9-6aa2-46c0-be51-142fda9d8c6b-bundle\") pod \"c84364d9-6aa2-46c0-be51-142fda9d8c6b\" (UID: \"c84364d9-6aa2-46c0-be51-142fda9d8c6b\") " Apr 17 16:39:22.973154 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:22.973079 2572 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-skfgl\" (UniqueName: \"kubernetes.io/projected/c84364d9-6aa2-46c0-be51-142fda9d8c6b-kube-api-access-skfgl\") pod \"c84364d9-6aa2-46c0-be51-142fda9d8c6b\" (UID: \"c84364d9-6aa2-46c0-be51-142fda9d8c6b\") " Apr 17 16:39:22.973423 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:22.973336 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hbt56\" (UniqueName: \"kubernetes.io/projected/2419ec7b-d89b-499a-9e49-6eb1027fb271-kube-api-access-hbt56\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\"" Apr 17 16:39:22.973423 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:22.973355 2572 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2419ec7b-d89b-499a-9e49-6eb1027fb271-bundle\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\"" Apr 17 16:39:22.973423 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:22.973371 2572 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2419ec7b-d89b-499a-9e49-6eb1027fb271-util\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\"" Apr 17 16:39:22.973646 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:22.973616 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c84364d9-6aa2-46c0-be51-142fda9d8c6b-bundle" (OuterVolumeSpecName: "bundle") pod "c84364d9-6aa2-46c0-be51-142fda9d8c6b" (UID: "c84364d9-6aa2-46c0-be51-142fda9d8c6b"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:39:22.973944 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:22.973742 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f33dd23-b02a-476a-adda-e5395e1fff15-bundle" (OuterVolumeSpecName: "bundle") pod "1f33dd23-b02a-476a-adda-e5395e1fff15" (UID: "1f33dd23-b02a-476a-adda-e5395e1fff15"). 
InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:39:22.975325 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:22.975301 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f33dd23-b02a-476a-adda-e5395e1fff15-kube-api-access-8tvmp" (OuterVolumeSpecName: "kube-api-access-8tvmp") pod "1f33dd23-b02a-476a-adda-e5395e1fff15" (UID: "1f33dd23-b02a-476a-adda-e5395e1fff15"). InnerVolumeSpecName "kube-api-access-8tvmp". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:39:22.975325 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:22.975312 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c84364d9-6aa2-46c0-be51-142fda9d8c6b-kube-api-access-skfgl" (OuterVolumeSpecName: "kube-api-access-skfgl") pod "c84364d9-6aa2-46c0-be51-142fda9d8c6b" (UID: "c84364d9-6aa2-46c0-be51-142fda9d8c6b"). InnerVolumeSpecName "kube-api-access-skfgl". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:39:22.978815 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:22.978796 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c84364d9-6aa2-46c0-be51-142fda9d8c6b-util" (OuterVolumeSpecName: "util") pod "c84364d9-6aa2-46c0-be51-142fda9d8c6b" (UID: "c84364d9-6aa2-46c0-be51-142fda9d8c6b"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:39:22.979205 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:22.979188 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f33dd23-b02a-476a-adda-e5395e1fff15-util" (OuterVolumeSpecName: "util") pod "1f33dd23-b02a-476a-adda-e5395e1fff15" (UID: "1f33dd23-b02a-476a-adda-e5395e1fff15"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:39:23.073916 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:23.073889 2572 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1f33dd23-b02a-476a-adda-e5395e1fff15-bundle\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\"" Apr 17 16:39:23.073916 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:23.073911 2572 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c84364d9-6aa2-46c0-be51-142fda9d8c6b-util\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\"" Apr 17 16:39:23.073916 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:23.073920 2572 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1f33dd23-b02a-476a-adda-e5395e1fff15-util\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\"" Apr 17 16:39:23.074102 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:23.073928 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8tvmp\" (UniqueName: \"kubernetes.io/projected/1f33dd23-b02a-476a-adda-e5395e1fff15-kube-api-access-8tvmp\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\"" Apr 17 16:39:23.074102 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:23.073939 2572 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c84364d9-6aa2-46c0-be51-142fda9d8c6b-bundle\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\"" Apr 17 16:39:23.074102 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:23.073948 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-skfgl\" (UniqueName: \"kubernetes.io/projected/c84364d9-6aa2-46c0-be51-142fda9d8c6b-kube-api-access-skfgl\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\"" Apr 17 16:39:23.635515 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:23.635487 2572 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503qhv2d" Apr 17 16:39:23.635916 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:23.635484 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503qhv2d" event={"ID":"2419ec7b-d89b-499a-9e49-6eb1027fb271","Type":"ContainerDied","Data":"695f708d478b91351c7771dc3b1f1b982e96e79fed625123c96677172adec8d3"} Apr 17 16:39:23.635916 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:23.635594 2572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="695f708d478b91351c7771dc3b1f1b982e96e79fed625123c96677172adec8d3" Apr 17 16:39:23.637173 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:23.637152 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30j5j8s" event={"ID":"c84364d9-6aa2-46c0-be51-142fda9d8c6b","Type":"ContainerDied","Data":"85a2df2b6995e62ebf7a306595b9aef23ccfdcf6750ab8c41c22b2dbf32d0379"} Apr 17 16:39:23.637292 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:23.637175 2572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="85a2df2b6995e62ebf7a306595b9aef23ccfdcf6750ab8c41c22b2dbf32d0379" Apr 17 16:39:23.637292 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:23.637184 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30j5j8s" Apr 17 16:39:23.638882 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:23.638864 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bgnph2" Apr 17 16:39:23.638882 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:23.638860 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bgnph2" event={"ID":"1f33dd23-b02a-476a-adda-e5395e1fff15","Type":"ContainerDied","Data":"a428af485593e32fcda7f3fb6f33ddd1fd366e9e696cf60c1fbc27ff33c1a386"} Apr 17 16:39:23.639028 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:23.638896 2572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a428af485593e32fcda7f3fb6f33ddd1fd366e9e696cf60c1fbc27ff33c1a386" Apr 17 16:39:23.763422 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:23.763402 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88dh6hg" Apr 17 16:39:23.880694 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:23.880665 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c97e7982-9471-4e35-8802-3d643a4d14a5-bundle\") pod \"c97e7982-9471-4e35-8802-3d643a4d14a5\" (UID: \"c97e7982-9471-4e35-8802-3d643a4d14a5\") " Apr 17 16:39:23.880870 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:23.880781 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c97e7982-9471-4e35-8802-3d643a4d14a5-util\") pod \"c97e7982-9471-4e35-8802-3d643a4d14a5\" (UID: \"c97e7982-9471-4e35-8802-3d643a4d14a5\") " Apr 17 16:39:23.880870 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:23.880813 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-spb6v\" (UniqueName: \"kubernetes.io/projected/c97e7982-9471-4e35-8802-3d643a4d14a5-kube-api-access-spb6v\") pod 
\"c97e7982-9471-4e35-8802-3d643a4d14a5\" (UID: \"c97e7982-9471-4e35-8802-3d643a4d14a5\") " Apr 17 16:39:23.881219 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:23.881186 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c97e7982-9471-4e35-8802-3d643a4d14a5-bundle" (OuterVolumeSpecName: "bundle") pod "c97e7982-9471-4e35-8802-3d643a4d14a5" (UID: "c97e7982-9471-4e35-8802-3d643a4d14a5"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:39:23.883026 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:23.882998 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c97e7982-9471-4e35-8802-3d643a4d14a5-kube-api-access-spb6v" (OuterVolumeSpecName: "kube-api-access-spb6v") pod "c97e7982-9471-4e35-8802-3d643a4d14a5" (UID: "c97e7982-9471-4e35-8802-3d643a4d14a5"). InnerVolumeSpecName "kube-api-access-spb6v". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:39:23.885855 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:23.885803 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c97e7982-9471-4e35-8802-3d643a4d14a5-util" (OuterVolumeSpecName: "util") pod "c97e7982-9471-4e35-8802-3d643a4d14a5" (UID: "c97e7982-9471-4e35-8802-3d643a4d14a5"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:39:23.981367 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:23.981345 2572 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c97e7982-9471-4e35-8802-3d643a4d14a5-bundle\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\"" Apr 17 16:39:23.981367 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:23.981368 2572 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c97e7982-9471-4e35-8802-3d643a4d14a5-util\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\"" Apr 17 16:39:23.981500 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:23.981379 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-spb6v\" (UniqueName: \"kubernetes.io/projected/c97e7982-9471-4e35-8802-3d643a4d14a5-kube-api-access-spb6v\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\"" Apr 17 16:39:24.644005 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:24.643979 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88dh6hg" Apr 17 16:39:24.644392 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:24.643978 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88dh6hg" event={"ID":"c97e7982-9471-4e35-8802-3d643a4d14a5","Type":"ContainerDied","Data":"eafcf3fbda038447ad0f32a0161ffdd36c320923e30a418847b8ee5751bd1f83"} Apr 17 16:39:24.644392 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:24.644087 2572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eafcf3fbda038447ad0f32a0161ffdd36c320923e30a418847b8ee5751bd1f83" Apr 17 16:39:44.374118 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:44.374087 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6c886788f8-4mfhw"] Apr 17 16:39:44.374540 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:44.374397 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1f33dd23-b02a-476a-adda-e5395e1fff15" containerName="pull" Apr 17 16:39:44.374540 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:44.374408 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f33dd23-b02a-476a-adda-e5395e1fff15" containerName="pull" Apr 17 16:39:44.374540 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:44.374419 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c97e7982-9471-4e35-8802-3d643a4d14a5" containerName="pull" Apr 17 16:39:44.374540 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:44.374425 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="c97e7982-9471-4e35-8802-3d643a4d14a5" containerName="pull" Apr 17 16:39:44.374540 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:44.374433 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2419ec7b-d89b-499a-9e49-6eb1027fb271" 
containerName="util" Apr 17 16:39:44.374540 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:44.374439 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="2419ec7b-d89b-499a-9e49-6eb1027fb271" containerName="util" Apr 17 16:39:44.374540 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:44.374446 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c97e7982-9471-4e35-8802-3d643a4d14a5" containerName="util" Apr 17 16:39:44.374540 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:44.374451 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="c97e7982-9471-4e35-8802-3d643a4d14a5" containerName="util" Apr 17 16:39:44.374540 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:44.374456 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2419ec7b-d89b-499a-9e49-6eb1027fb271" containerName="pull" Apr 17 16:39:44.374540 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:44.374462 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="2419ec7b-d89b-499a-9e49-6eb1027fb271" containerName="pull" Apr 17 16:39:44.374540 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:44.374469 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2419ec7b-d89b-499a-9e49-6eb1027fb271" containerName="extract" Apr 17 16:39:44.374540 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:44.374474 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="2419ec7b-d89b-499a-9e49-6eb1027fb271" containerName="extract" Apr 17 16:39:44.374540 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:44.374489 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c84364d9-6aa2-46c0-be51-142fda9d8c6b" containerName="util" Apr 17 16:39:44.374540 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:44.374494 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="c84364d9-6aa2-46c0-be51-142fda9d8c6b" containerName="util" Apr 17 16:39:44.374540 ip-10-0-131-177 kubenswrapper[2572]: 
I0417 16:39:44.374499 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1f33dd23-b02a-476a-adda-e5395e1fff15" containerName="util" Apr 17 16:39:44.374540 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:44.374504 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f33dd23-b02a-476a-adda-e5395e1fff15" containerName="util" Apr 17 16:39:44.374540 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:44.374508 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c84364d9-6aa2-46c0-be51-142fda9d8c6b" containerName="extract" Apr 17 16:39:44.374540 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:44.374514 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="c84364d9-6aa2-46c0-be51-142fda9d8c6b" containerName="extract" Apr 17 16:39:44.374540 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:44.374520 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c97e7982-9471-4e35-8802-3d643a4d14a5" containerName="extract" Apr 17 16:39:44.374540 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:44.374525 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="c97e7982-9471-4e35-8802-3d643a4d14a5" containerName="extract" Apr 17 16:39:44.374540 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:44.374531 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c84364d9-6aa2-46c0-be51-142fda9d8c6b" containerName="pull" Apr 17 16:39:44.374540 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:44.374537 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="c84364d9-6aa2-46c0-be51-142fda9d8c6b" containerName="pull" Apr 17 16:39:44.374540 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:44.374542 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1f33dd23-b02a-476a-adda-e5395e1fff15" containerName="extract" Apr 17 16:39:44.374540 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:44.374548 2572 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="1f33dd23-b02a-476a-adda-e5395e1fff15" containerName="extract" Apr 17 16:39:44.375248 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:44.374593 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="1f33dd23-b02a-476a-adda-e5395e1fff15" containerName="extract" Apr 17 16:39:44.375248 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:44.374601 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="c97e7982-9471-4e35-8802-3d643a4d14a5" containerName="extract" Apr 17 16:39:44.375248 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:44.374609 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="2419ec7b-d89b-499a-9e49-6eb1027fb271" containerName="extract" Apr 17 16:39:44.375248 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:44.374616 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="c84364d9-6aa2-46c0-be51-142fda9d8c6b" containerName="extract" Apr 17 16:39:44.379112 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:44.379089 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-4mfhw" Apr 17 16:39:44.381741 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:44.381696 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 17 16:39:44.381871 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:44.381785 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-8qclv\"" Apr 17 16:39:44.382481 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:44.382455 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"plugin-serving-cert\"" Apr 17 16:39:44.382594 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:44.382494 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kuadrant-console-nginx-conf\"" Apr 17 16:39:44.382594 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:44.382514 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 17 16:39:44.393296 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:44.393276 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6c886788f8-4mfhw"] Apr 17 16:39:44.437096 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:44.437063 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjf74\" (UniqueName: \"kubernetes.io/projected/7923ca69-0185-41c6-84a2-64e801337d1d-kube-api-access-zjf74\") pod \"kuadrant-console-plugin-6c886788f8-4mfhw\" (UID: \"7923ca69-0185-41c6-84a2-64e801337d1d\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-4mfhw" Apr 17 16:39:44.437246 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:44.437108 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/7923ca69-0185-41c6-84a2-64e801337d1d-nginx-conf\") pod \"kuadrant-console-plugin-6c886788f8-4mfhw\" (UID: \"7923ca69-0185-41c6-84a2-64e801337d1d\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-4mfhw" Apr 17 16:39:44.437246 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:44.437218 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/7923ca69-0185-41c6-84a2-64e801337d1d-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-4mfhw\" (UID: \"7923ca69-0185-41c6-84a2-64e801337d1d\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-4mfhw" Apr 17 16:39:44.537797 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:44.537765 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/7923ca69-0185-41c6-84a2-64e801337d1d-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-4mfhw\" (UID: \"7923ca69-0185-41c6-84a2-64e801337d1d\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-4mfhw" Apr 17 16:39:44.537964 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:44.537861 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zjf74\" (UniqueName: \"kubernetes.io/projected/7923ca69-0185-41c6-84a2-64e801337d1d-kube-api-access-zjf74\") pod \"kuadrant-console-plugin-6c886788f8-4mfhw\" (UID: \"7923ca69-0185-41c6-84a2-64e801337d1d\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-4mfhw" Apr 17 16:39:44.537964 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:44.537887 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/7923ca69-0185-41c6-84a2-64e801337d1d-nginx-conf\") pod \"kuadrant-console-plugin-6c886788f8-4mfhw\" (UID: \"7923ca69-0185-41c6-84a2-64e801337d1d\") " 
pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-4mfhw" Apr 17 16:39:44.537964 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:39:44.537914 2572 secret.go:189] Couldn't get secret kuadrant-system/plugin-serving-cert: secret "plugin-serving-cert" not found Apr 17 16:39:44.538131 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:39:44.537987 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7923ca69-0185-41c6-84a2-64e801337d1d-plugin-serving-cert podName:7923ca69-0185-41c6-84a2-64e801337d1d nodeName:}" failed. No retries permitted until 2026-04-17 16:39:45.037967229 +0000 UTC m=+507.625556368 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/7923ca69-0185-41c6-84a2-64e801337d1d-plugin-serving-cert") pod "kuadrant-console-plugin-6c886788f8-4mfhw" (UID: "7923ca69-0185-41c6-84a2-64e801337d1d") : secret "plugin-serving-cert" not found Apr 17 16:39:44.538520 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:44.538498 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/7923ca69-0185-41c6-84a2-64e801337d1d-nginx-conf\") pod \"kuadrant-console-plugin-6c886788f8-4mfhw\" (UID: \"7923ca69-0185-41c6-84a2-64e801337d1d\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-4mfhw" Apr 17 16:39:44.549763 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:44.549743 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjf74\" (UniqueName: \"kubernetes.io/projected/7923ca69-0185-41c6-84a2-64e801337d1d-kube-api-access-zjf74\") pod \"kuadrant-console-plugin-6c886788f8-4mfhw\" (UID: \"7923ca69-0185-41c6-84a2-64e801337d1d\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-4mfhw" Apr 17 16:39:45.042323 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:45.042271 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/7923ca69-0185-41c6-84a2-64e801337d1d-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-4mfhw\" (UID: \"7923ca69-0185-41c6-84a2-64e801337d1d\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-4mfhw" Apr 17 16:39:45.044806 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:45.044779 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/7923ca69-0185-41c6-84a2-64e801337d1d-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-4mfhw\" (UID: \"7923ca69-0185-41c6-84a2-64e801337d1d\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-4mfhw" Apr 17 16:39:45.289678 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:45.289642 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-4mfhw" Apr 17 16:39:45.422480 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:45.422447 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6c886788f8-4mfhw"] Apr 17 16:39:45.425473 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:39:45.425438 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7923ca69_0185_41c6_84a2_64e801337d1d.slice/crio-a5ecfb41e90f138c48985fa1f6530767c7c5e31c52a81fc60f83f5e6fabcf0d1 WatchSource:0}: Error finding container a5ecfb41e90f138c48985fa1f6530767c7c5e31c52a81fc60f83f5e6fabcf0d1: Status 404 returned error can't find the container with id a5ecfb41e90f138c48985fa1f6530767c7c5e31c52a81fc60f83f5e6fabcf0d1 Apr 17 16:39:45.723694 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:45.723653 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-4mfhw" 
event={"ID":"7923ca69-0185-41c6-84a2-64e801337d1d","Type":"ContainerStarted","Data":"a5ecfb41e90f138c48985fa1f6530767c7c5e31c52a81fc60f83f5e6fabcf0d1"} Apr 17 16:39:50.745513 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:39:50.745478 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-4mfhw" event={"ID":"7923ca69-0185-41c6-84a2-64e801337d1d","Type":"ContainerStarted","Data":"cb014b4393e9b360b9bc9f1747b78ae3369ce3b5a03f782a711d3cfd0e5d66db"} Apr 17 16:40:23.143688 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:40:23.143580 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-4mfhw" podStartSLOduration=34.613273953 podStartE2EDuration="39.143563879s" podCreationTimestamp="2026-04-17 16:39:44 +0000 UTC" firstStartedPulling="2026-04-17 16:39:45.426882531 +0000 UTC m=+508.014471482" lastFinishedPulling="2026-04-17 16:39:49.957172458 +0000 UTC m=+512.544761408" observedRunningTime="2026-04-17 16:39:50.778338183 +0000 UTC m=+513.365927153" watchObservedRunningTime="2026-04-17 16:40:23.143563879 +0000 UTC m=+545.731152924" Apr 17 16:40:23.144283 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:40:23.143745 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-79d89b468f-t84r9"] Apr 17 16:40:23.169330 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:40:23.169299 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-79d89b468f-t84r9"] Apr 17 16:40:23.169482 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:40:23.169424 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-79d89b468f-t84r9" Apr 17 16:40:23.265194 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:40:23.265161 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc0c0d7a-e38d-4dac-a3cc-dbddcbbcfec8-trusted-ca-bundle\") pod \"console-79d89b468f-t84r9\" (UID: \"fc0c0d7a-e38d-4dac-a3cc-dbddcbbcfec8\") " pod="openshift-console/console-79d89b468f-t84r9" Apr 17 16:40:23.265358 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:40:23.265200 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fc0c0d7a-e38d-4dac-a3cc-dbddcbbcfec8-console-config\") pod \"console-79d89b468f-t84r9\" (UID: \"fc0c0d7a-e38d-4dac-a3cc-dbddcbbcfec8\") " pod="openshift-console/console-79d89b468f-t84r9" Apr 17 16:40:23.265358 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:40:23.265282 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fc0c0d7a-e38d-4dac-a3cc-dbddcbbcfec8-service-ca\") pod \"console-79d89b468f-t84r9\" (UID: \"fc0c0d7a-e38d-4dac-a3cc-dbddcbbcfec8\") " pod="openshift-console/console-79d89b468f-t84r9" Apr 17 16:40:23.265450 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:40:23.265357 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fc0c0d7a-e38d-4dac-a3cc-dbddcbbcfec8-console-oauth-config\") pod \"console-79d89b468f-t84r9\" (UID: \"fc0c0d7a-e38d-4dac-a3cc-dbddcbbcfec8\") " pod="openshift-console/console-79d89b468f-t84r9" Apr 17 16:40:23.265450 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:40:23.265389 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" 
(UniqueName: \"kubernetes.io/configmap/fc0c0d7a-e38d-4dac-a3cc-dbddcbbcfec8-oauth-serving-cert\") pod \"console-79d89b468f-t84r9\" (UID: \"fc0c0d7a-e38d-4dac-a3cc-dbddcbbcfec8\") " pod="openshift-console/console-79d89b468f-t84r9" Apr 17 16:40:23.265450 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:40:23.265413 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9v25\" (UniqueName: \"kubernetes.io/projected/fc0c0d7a-e38d-4dac-a3cc-dbddcbbcfec8-kube-api-access-x9v25\") pod \"console-79d89b468f-t84r9\" (UID: \"fc0c0d7a-e38d-4dac-a3cc-dbddcbbcfec8\") " pod="openshift-console/console-79d89b468f-t84r9" Apr 17 16:40:23.265450 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:40:23.265432 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fc0c0d7a-e38d-4dac-a3cc-dbddcbbcfec8-console-serving-cert\") pod \"console-79d89b468f-t84r9\" (UID: \"fc0c0d7a-e38d-4dac-a3cc-dbddcbbcfec8\") " pod="openshift-console/console-79d89b468f-t84r9" Apr 17 16:40:23.366286 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:40:23.366252 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fc0c0d7a-e38d-4dac-a3cc-dbddcbbcfec8-console-oauth-config\") pod \"console-79d89b468f-t84r9\" (UID: \"fc0c0d7a-e38d-4dac-a3cc-dbddcbbcfec8\") " pod="openshift-console/console-79d89b468f-t84r9" Apr 17 16:40:23.366286 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:40:23.366290 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fc0c0d7a-e38d-4dac-a3cc-dbddcbbcfec8-oauth-serving-cert\") pod \"console-79d89b468f-t84r9\" (UID: \"fc0c0d7a-e38d-4dac-a3cc-dbddcbbcfec8\") " pod="openshift-console/console-79d89b468f-t84r9" Apr 17 16:40:23.366495 ip-10-0-131-177 kubenswrapper[2572]: 
I0417 16:40:23.366319 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x9v25\" (UniqueName: \"kubernetes.io/projected/fc0c0d7a-e38d-4dac-a3cc-dbddcbbcfec8-kube-api-access-x9v25\") pod \"console-79d89b468f-t84r9\" (UID: \"fc0c0d7a-e38d-4dac-a3cc-dbddcbbcfec8\") " pod="openshift-console/console-79d89b468f-t84r9" Apr 17 16:40:23.366495 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:40:23.366349 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fc0c0d7a-e38d-4dac-a3cc-dbddcbbcfec8-console-serving-cert\") pod \"console-79d89b468f-t84r9\" (UID: \"fc0c0d7a-e38d-4dac-a3cc-dbddcbbcfec8\") " pod="openshift-console/console-79d89b468f-t84r9" Apr 17 16:40:23.366495 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:40:23.366381 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc0c0d7a-e38d-4dac-a3cc-dbddcbbcfec8-trusted-ca-bundle\") pod \"console-79d89b468f-t84r9\" (UID: \"fc0c0d7a-e38d-4dac-a3cc-dbddcbbcfec8\") " pod="openshift-console/console-79d89b468f-t84r9" Apr 17 16:40:23.366495 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:40:23.366414 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fc0c0d7a-e38d-4dac-a3cc-dbddcbbcfec8-console-config\") pod \"console-79d89b468f-t84r9\" (UID: \"fc0c0d7a-e38d-4dac-a3cc-dbddcbbcfec8\") " pod="openshift-console/console-79d89b468f-t84r9" Apr 17 16:40:23.366495 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:40:23.366467 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fc0c0d7a-e38d-4dac-a3cc-dbddcbbcfec8-service-ca\") pod \"console-79d89b468f-t84r9\" (UID: \"fc0c0d7a-e38d-4dac-a3cc-dbddcbbcfec8\") " 
pod="openshift-console/console-79d89b468f-t84r9" Apr 17 16:40:23.367416 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:40:23.367374 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fc0c0d7a-e38d-4dac-a3cc-dbddcbbcfec8-oauth-serving-cert\") pod \"console-79d89b468f-t84r9\" (UID: \"fc0c0d7a-e38d-4dac-a3cc-dbddcbbcfec8\") " pod="openshift-console/console-79d89b468f-t84r9" Apr 17 16:40:23.367563 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:40:23.367476 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc0c0d7a-e38d-4dac-a3cc-dbddcbbcfec8-trusted-ca-bundle\") pod \"console-79d89b468f-t84r9\" (UID: \"fc0c0d7a-e38d-4dac-a3cc-dbddcbbcfec8\") " pod="openshift-console/console-79d89b468f-t84r9" Apr 17 16:40:23.367895 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:40:23.367869 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fc0c0d7a-e38d-4dac-a3cc-dbddcbbcfec8-console-config\") pod \"console-79d89b468f-t84r9\" (UID: \"fc0c0d7a-e38d-4dac-a3cc-dbddcbbcfec8\") " pod="openshift-console/console-79d89b468f-t84r9" Apr 17 16:40:23.368179 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:40:23.368145 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fc0c0d7a-e38d-4dac-a3cc-dbddcbbcfec8-service-ca\") pod \"console-79d89b468f-t84r9\" (UID: \"fc0c0d7a-e38d-4dac-a3cc-dbddcbbcfec8\") " pod="openshift-console/console-79d89b468f-t84r9" Apr 17 16:40:23.369812 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:40:23.369781 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fc0c0d7a-e38d-4dac-a3cc-dbddcbbcfec8-console-serving-cert\") pod \"console-79d89b468f-t84r9\" (UID: 
\"fc0c0d7a-e38d-4dac-a3cc-dbddcbbcfec8\") " pod="openshift-console/console-79d89b468f-t84r9" Apr 17 16:40:23.375335 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:40:23.375312 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fc0c0d7a-e38d-4dac-a3cc-dbddcbbcfec8-console-oauth-config\") pod \"console-79d89b468f-t84r9\" (UID: \"fc0c0d7a-e38d-4dac-a3cc-dbddcbbcfec8\") " pod="openshift-console/console-79d89b468f-t84r9" Apr 17 16:40:23.377930 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:40:23.377895 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9v25\" (UniqueName: \"kubernetes.io/projected/fc0c0d7a-e38d-4dac-a3cc-dbddcbbcfec8-kube-api-access-x9v25\") pod \"console-79d89b468f-t84r9\" (UID: \"fc0c0d7a-e38d-4dac-a3cc-dbddcbbcfec8\") " pod="openshift-console/console-79d89b468f-t84r9" Apr 17 16:40:23.478655 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:40:23.478590 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-79d89b468f-t84r9"
Apr 17 16:40:23.607976 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:40:23.607953 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-79d89b468f-t84r9"]
Apr 17 16:40:23.609952 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:40:23.609913 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc0c0d7a_e38d_4dac_a3cc_dbddcbbcfec8.slice/crio-cf10f03cf9ac9703b8f6a80e50dcb3e8f9e71ecb9a5ec30fad05080860edf5ba WatchSource:0}: Error finding container cf10f03cf9ac9703b8f6a80e50dcb3e8f9e71ecb9a5ec30fad05080860edf5ba: Status 404 returned error can't find the container with id cf10f03cf9ac9703b8f6a80e50dcb3e8f9e71ecb9a5ec30fad05080860edf5ba
Apr 17 16:40:23.869839 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:40:23.869805 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-79d89b468f-t84r9" event={"ID":"fc0c0d7a-e38d-4dac-a3cc-dbddcbbcfec8","Type":"ContainerStarted","Data":"f104bd5f4aae8c6260570923f8b24e1ce6679c01cf8312e3e43b6dc4c71c8219"}
Apr 17 16:40:23.869839 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:40:23.869842 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-79d89b468f-t84r9" event={"ID":"fc0c0d7a-e38d-4dac-a3cc-dbddcbbcfec8","Type":"ContainerStarted","Data":"cf10f03cf9ac9703b8f6a80e50dcb3e8f9e71ecb9a5ec30fad05080860edf5ba"}
Apr 17 16:40:23.887663 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:40:23.887594 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-79d89b468f-t84r9" podStartSLOduration=0.887580544 podStartE2EDuration="887.580544ms" podCreationTimestamp="2026-04-17 16:40:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:40:23.886526975 +0000 UTC m=+546.474115947" watchObservedRunningTime="2026-04-17 16:40:23.887580544 +0000 UTC m=+546.475169515"
Apr 17 16:40:28.963910 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:40:28.963876 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-nw4zx"]
Apr 17 16:40:28.967419 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:40:28.967402 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-67566c68b4-nw4zx"
Apr 17 16:40:28.969678 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:40:28.969651 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\""
Apr 17 16:40:28.975010 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:40:28.974709 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-nw4zx"]
Apr 17 16:40:29.000553 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:40:29.000527 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-nw4zx"]
Apr 17 16:40:29.113755 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:40:29.113697 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfwdp\" (UniqueName: \"kubernetes.io/projected/94060e29-bfc5-4851-8bd6-bc14a5f01acc-kube-api-access-dfwdp\") pod \"limitador-limitador-67566c68b4-nw4zx\" (UID: \"94060e29-bfc5-4851-8bd6-bc14a5f01acc\") " pod="kuadrant-system/limitador-limitador-67566c68b4-nw4zx"
Apr 17 16:40:29.113906 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:40:29.113833 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/94060e29-bfc5-4851-8bd6-bc14a5f01acc-config-file\") pod \"limitador-limitador-67566c68b4-nw4zx\" (UID: \"94060e29-bfc5-4851-8bd6-bc14a5f01acc\") " pod="kuadrant-system/limitador-limitador-67566c68b4-nw4zx"
Apr 17 16:40:29.214277 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:40:29.214202 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/94060e29-bfc5-4851-8bd6-bc14a5f01acc-config-file\") pod \"limitador-limitador-67566c68b4-nw4zx\" (UID: \"94060e29-bfc5-4851-8bd6-bc14a5f01acc\") " pod="kuadrant-system/limitador-limitador-67566c68b4-nw4zx"
Apr 17 16:40:29.214277 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:40:29.214255 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dfwdp\" (UniqueName: \"kubernetes.io/projected/94060e29-bfc5-4851-8bd6-bc14a5f01acc-kube-api-access-dfwdp\") pod \"limitador-limitador-67566c68b4-nw4zx\" (UID: \"94060e29-bfc5-4851-8bd6-bc14a5f01acc\") " pod="kuadrant-system/limitador-limitador-67566c68b4-nw4zx"
Apr 17 16:40:29.214818 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:40:29.214799 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/94060e29-bfc5-4851-8bd6-bc14a5f01acc-config-file\") pod \"limitador-limitador-67566c68b4-nw4zx\" (UID: \"94060e29-bfc5-4851-8bd6-bc14a5f01acc\") " pod="kuadrant-system/limitador-limitador-67566c68b4-nw4zx"
Apr 17 16:40:29.222128 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:40:29.222106 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfwdp\" (UniqueName: \"kubernetes.io/projected/94060e29-bfc5-4851-8bd6-bc14a5f01acc-kube-api-access-dfwdp\") pod \"limitador-limitador-67566c68b4-nw4zx\" (UID: \"94060e29-bfc5-4851-8bd6-bc14a5f01acc\") " pod="kuadrant-system/limitador-limitador-67566c68b4-nw4zx"
Apr 17 16:40:29.279124 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:40:29.279096 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-67566c68b4-nw4zx"
Apr 17 16:40:29.402347 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:40:29.402318 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-nw4zx"]
Apr 17 16:40:29.404616 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:40:29.404582 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94060e29_bfc5_4851_8bd6_bc14a5f01acc.slice/crio-6096276fcb026f3b2b6db3c938803c447345d3febcf8762b1436a96e79973b22 WatchSource:0}: Error finding container 6096276fcb026f3b2b6db3c938803c447345d3febcf8762b1436a96e79973b22: Status 404 returned error can't find the container with id 6096276fcb026f3b2b6db3c938803c447345d3febcf8762b1436a96e79973b22
Apr 17 16:40:29.893739 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:40:29.893693 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-67566c68b4-nw4zx" event={"ID":"94060e29-bfc5-4851-8bd6-bc14a5f01acc","Type":"ContainerStarted","Data":"6096276fcb026f3b2b6db3c938803c447345d3febcf8762b1436a96e79973b22"}
Apr 17 16:40:30.898849 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:40:30.898819 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-67566c68b4-nw4zx" event={"ID":"94060e29-bfc5-4851-8bd6-bc14a5f01acc","Type":"ContainerStarted","Data":"95d50871e558b6ef2bf47e1629395ffb3b1916adc2f82b21702ba38cd4f8fcdd"}
Apr 17 16:40:30.899213 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:40:30.898932 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-67566c68b4-nw4zx"
Apr 17 16:40:30.915730 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:40:30.915668 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-67566c68b4-nw4zx" podStartSLOduration=1.7458706529999999 podStartE2EDuration="2.915656415s" podCreationTimestamp="2026-04-17 16:40:28 +0000 UTC" firstStartedPulling="2026-04-17 16:40:29.406488951 +0000 UTC m=+551.994077898" lastFinishedPulling="2026-04-17 16:40:30.576274703 +0000 UTC m=+553.163863660" observedRunningTime="2026-04-17 16:40:30.914920588 +0000 UTC m=+553.502509559" watchObservedRunningTime="2026-04-17 16:40:30.915656415 +0000 UTC m=+553.503245385"
Apr 17 16:40:33.479608 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:40:33.479576 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-79d89b468f-t84r9"
Apr 17 16:40:33.480043 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:40:33.479651 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-79d89b468f-t84r9"
Apr 17 16:40:33.484104 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:40:33.484085 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-79d89b468f-t84r9"
Apr 17 16:40:33.913545 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:40:33.913516 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-79d89b468f-t84r9"
Apr 17 16:40:33.954832 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:40:33.954801 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-8546bb5cc6-5hpzq"]
Apr 17 16:40:41.904090 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:40:41.904063 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-67566c68b4-nw4zx"
Apr 17 16:40:58.976436 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:40:58.976375 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-8546bb5cc6-5hpzq" podUID="f4bf6cc0-f391-4da0-8c81-f7bf071794e4" containerName="console" containerID="cri-o://048872c40d607d005986941acc3891dc070df6e8c013956d588a154d79dec105" gracePeriod=15
Apr 17 16:40:59.219772 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:40:59.219752 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-8546bb5cc6-5hpzq_f4bf6cc0-f391-4da0-8c81-f7bf071794e4/console/0.log"
Apr 17 16:40:59.219881 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:40:59.219815 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-8546bb5cc6-5hpzq"
Apr 17 16:40:59.253900 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:40:59.253831 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nsrrs\" (UniqueName: \"kubernetes.io/projected/f4bf6cc0-f391-4da0-8c81-f7bf071794e4-kube-api-access-nsrrs\") pod \"f4bf6cc0-f391-4da0-8c81-f7bf071794e4\" (UID: \"f4bf6cc0-f391-4da0-8c81-f7bf071794e4\") "
Apr 17 16:40:59.254045 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:40:59.253907 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f4bf6cc0-f391-4da0-8c81-f7bf071794e4-service-ca\") pod \"f4bf6cc0-f391-4da0-8c81-f7bf071794e4\" (UID: \"f4bf6cc0-f391-4da0-8c81-f7bf071794e4\") "
Apr 17 16:40:59.254045 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:40:59.253940 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f4bf6cc0-f391-4da0-8c81-f7bf071794e4-console-serving-cert\") pod \"f4bf6cc0-f391-4da0-8c81-f7bf071794e4\" (UID: \"f4bf6cc0-f391-4da0-8c81-f7bf071794e4\") "
Apr 17 16:40:59.254045 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:40:59.253979 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f4bf6cc0-f391-4da0-8c81-f7bf071794e4-trusted-ca-bundle\") pod \"f4bf6cc0-f391-4da0-8c81-f7bf071794e4\" (UID: \"f4bf6cc0-f391-4da0-8c81-f7bf071794e4\") "
Apr 17 16:40:59.254045 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:40:59.254016 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f4bf6cc0-f391-4da0-8c81-f7bf071794e4-console-config\") pod \"f4bf6cc0-f391-4da0-8c81-f7bf071794e4\" (UID: \"f4bf6cc0-f391-4da0-8c81-f7bf071794e4\") "
Apr 17 16:40:59.254045 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:40:59.254040 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f4bf6cc0-f391-4da0-8c81-f7bf071794e4-console-oauth-config\") pod \"f4bf6cc0-f391-4da0-8c81-f7bf071794e4\" (UID: \"f4bf6cc0-f391-4da0-8c81-f7bf071794e4\") "
Apr 17 16:40:59.254321 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:40:59.254102 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f4bf6cc0-f391-4da0-8c81-f7bf071794e4-oauth-serving-cert\") pod \"f4bf6cc0-f391-4da0-8c81-f7bf071794e4\" (UID: \"f4bf6cc0-f391-4da0-8c81-f7bf071794e4\") "
Apr 17 16:40:59.254380 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:40:59.254315 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4bf6cc0-f391-4da0-8c81-f7bf071794e4-service-ca" (OuterVolumeSpecName: "service-ca") pod "f4bf6cc0-f391-4da0-8c81-f7bf071794e4" (UID: "f4bf6cc0-f391-4da0-8c81-f7bf071794e4"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 16:40:59.254441 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:40:59.254389 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4bf6cc0-f391-4da0-8c81-f7bf071794e4-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "f4bf6cc0-f391-4da0-8c81-f7bf071794e4" (UID: "f4bf6cc0-f391-4da0-8c81-f7bf071794e4"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 16:40:59.254560 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:40:59.254541 2572 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f4bf6cc0-f391-4da0-8c81-f7bf071794e4-trusted-ca-bundle\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\""
Apr 17 16:40:59.254630 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:40:59.254565 2572 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f4bf6cc0-f391-4da0-8c81-f7bf071794e4-service-ca\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\""
Apr 17 16:40:59.254689 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:40:59.254650 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4bf6cc0-f391-4da0-8c81-f7bf071794e4-console-config" (OuterVolumeSpecName: "console-config") pod "f4bf6cc0-f391-4da0-8c81-f7bf071794e4" (UID: "f4bf6cc0-f391-4da0-8c81-f7bf071794e4"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 16:40:59.255044 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:40:59.255015 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4bf6cc0-f391-4da0-8c81-f7bf071794e4-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "f4bf6cc0-f391-4da0-8c81-f7bf071794e4" (UID: "f4bf6cc0-f391-4da0-8c81-f7bf071794e4"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 16:40:59.256359 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:40:59.256326 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4bf6cc0-f391-4da0-8c81-f7bf071794e4-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "f4bf6cc0-f391-4da0-8c81-f7bf071794e4" (UID: "f4bf6cc0-f391-4da0-8c81-f7bf071794e4"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 16:40:59.256517 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:40:59.256500 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4bf6cc0-f391-4da0-8c81-f7bf071794e4-kube-api-access-nsrrs" (OuterVolumeSpecName: "kube-api-access-nsrrs") pod "f4bf6cc0-f391-4da0-8c81-f7bf071794e4" (UID: "f4bf6cc0-f391-4da0-8c81-f7bf071794e4"). InnerVolumeSpecName "kube-api-access-nsrrs". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 16:40:59.256618 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:40:59.256604 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4bf6cc0-f391-4da0-8c81-f7bf071794e4-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "f4bf6cc0-f391-4da0-8c81-f7bf071794e4" (UID: "f4bf6cc0-f391-4da0-8c81-f7bf071794e4"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 16:40:59.355617 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:40:59.355583 2572 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f4bf6cc0-f391-4da0-8c81-f7bf071794e4-console-config\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\""
Apr 17 16:40:59.355617 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:40:59.355611 2572 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f4bf6cc0-f391-4da0-8c81-f7bf071794e4-console-oauth-config\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\""
Apr 17 16:40:59.355617 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:40:59.355621 2572 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f4bf6cc0-f391-4da0-8c81-f7bf071794e4-oauth-serving-cert\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\""
Apr 17 16:40:59.355863 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:40:59.355630 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nsrrs\" (UniqueName: \"kubernetes.io/projected/f4bf6cc0-f391-4da0-8c81-f7bf071794e4-kube-api-access-nsrrs\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\""
Apr 17 16:40:59.355863 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:40:59.355639 2572 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f4bf6cc0-f391-4da0-8c81-f7bf071794e4-console-serving-cert\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\""
Apr 17 16:41:00.005672 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:41:00.005643 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-8546bb5cc6-5hpzq_f4bf6cc0-f391-4da0-8c81-f7bf071794e4/console/0.log"
Apr 17 16:41:00.006067 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:41:00.005687 2572 generic.go:358] "Generic (PLEG): container finished" podID="f4bf6cc0-f391-4da0-8c81-f7bf071794e4" containerID="048872c40d607d005986941acc3891dc070df6e8c013956d588a154d79dec105" exitCode=2
Apr 17 16:41:00.006067 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:41:00.005770 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-8546bb5cc6-5hpzq"
Apr 17 16:41:00.006067 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:41:00.005780 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-8546bb5cc6-5hpzq" event={"ID":"f4bf6cc0-f391-4da0-8c81-f7bf071794e4","Type":"ContainerDied","Data":"048872c40d607d005986941acc3891dc070df6e8c013956d588a154d79dec105"}
Apr 17 16:41:00.006067 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:41:00.005818 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-8546bb5cc6-5hpzq" event={"ID":"f4bf6cc0-f391-4da0-8c81-f7bf071794e4","Type":"ContainerDied","Data":"c896ab01f2188e2277072777f3e91afcc08f251be851112668766015aa9a4fc7"}
Apr 17 16:41:00.006067 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:41:00.005832 2572 scope.go:117] "RemoveContainer" containerID="048872c40d607d005986941acc3891dc070df6e8c013956d588a154d79dec105"
Apr 17 16:41:00.014762 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:41:00.014742 2572 scope.go:117] "RemoveContainer" containerID="048872c40d607d005986941acc3891dc070df6e8c013956d588a154d79dec105"
Apr 17 16:41:00.015038 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:41:00.015019 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"048872c40d607d005986941acc3891dc070df6e8c013956d588a154d79dec105\": container with ID starting with 048872c40d607d005986941acc3891dc070df6e8c013956d588a154d79dec105 not found: ID does not exist" containerID="048872c40d607d005986941acc3891dc070df6e8c013956d588a154d79dec105"
Apr 17 16:41:00.015118 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:41:00.015046 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"048872c40d607d005986941acc3891dc070df6e8c013956d588a154d79dec105"} err="failed to get container status \"048872c40d607d005986941acc3891dc070df6e8c013956d588a154d79dec105\": rpc error: code = NotFound desc = could not find container \"048872c40d607d005986941acc3891dc070df6e8c013956d588a154d79dec105\": container with ID starting with 048872c40d607d005986941acc3891dc070df6e8c013956d588a154d79dec105 not found: ID does not exist"
Apr 17 16:41:00.022301 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:41:00.022274 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-8546bb5cc6-5hpzq"]
Apr 17 16:41:00.025355 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:41:00.025324 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-8546bb5cc6-5hpzq"]
Apr 17 16:41:01.973604 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:41:01.973569 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4bf6cc0-f391-4da0-8c81-f7bf071794e4" path="/var/lib/kubelet/pods/f4bf6cc0-f391-4da0-8c81-f7bf071794e4/volumes"
Apr 17 16:41:17.868312 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:41:17.868285 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-qghlf_4135c5b6-7f8a-4eaf-b551-405c8ab00981/console-operator/1.log"
Apr 17 16:41:17.869183 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:41:17.869161 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-qghlf_4135c5b6-7f8a-4eaf-b551-405c8ab00981/console-operator/1.log"
Apr 17 16:41:17.873365 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:41:17.873338 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vj4bz_f0075f81-88ff-4518-bd4e-bc50656a593b/ovn-acl-logging/0.log"
Apr 17 16:41:17.874206 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:41:17.874186 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vj4bz_f0075f81-88ff-4518-bd4e-bc50656a593b/ovn-acl-logging/0.log"
Apr 17 16:42:36.070140 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:42:36.070100 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-pkbzr"]
Apr 17 16:42:36.070707 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:42:36.070611 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f4bf6cc0-f391-4da0-8c81-f7bf071794e4" containerName="console"
Apr 17 16:42:36.070707 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:42:36.070631 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4bf6cc0-f391-4da0-8c81-f7bf071794e4" containerName="console"
Apr 17 16:42:36.070859 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:42:36.070761 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="f4bf6cc0-f391-4da0-8c81-f7bf071794e4" containerName="console"
Apr 17 16:42:36.073777 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:42:36.073757 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-pkbzr"
Apr 17 16:42:36.076478 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:42:36.076267 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\""
Apr 17 16:42:36.076478 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:42:36.076267 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\""
Apr 17 16:42:36.076478 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:42:36.076327 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\""
Apr 17 16:42:36.076478 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:42:36.076473 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-lwlxq\""
Apr 17 16:42:36.080030 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:42:36.080007 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-pkbzr"]
Apr 17 16:42:36.161814 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:42:36.161779 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8kmt\" (UniqueName: \"kubernetes.io/projected/eba3d1d6-340f-40f0-a1fe-699dc39bfa1c-kube-api-access-x8kmt\") pod \"s3-init-pkbzr\" (UID: \"eba3d1d6-340f-40f0-a1fe-699dc39bfa1c\") " pod="kserve/s3-init-pkbzr"
Apr 17 16:42:36.262391 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:42:36.262359 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x8kmt\" (UniqueName: \"kubernetes.io/projected/eba3d1d6-340f-40f0-a1fe-699dc39bfa1c-kube-api-access-x8kmt\") pod \"s3-init-pkbzr\" (UID: \"eba3d1d6-340f-40f0-a1fe-699dc39bfa1c\") " pod="kserve/s3-init-pkbzr"
Apr 17 16:42:36.271616 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:42:36.271582 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8kmt\" (UniqueName: \"kubernetes.io/projected/eba3d1d6-340f-40f0-a1fe-699dc39bfa1c-kube-api-access-x8kmt\") pod \"s3-init-pkbzr\" (UID: \"eba3d1d6-340f-40f0-a1fe-699dc39bfa1c\") " pod="kserve/s3-init-pkbzr"
Apr 17 16:42:36.384084 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:42:36.384060 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-pkbzr"
Apr 17 16:42:36.510806 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:42:36.510781 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-pkbzr"]
Apr 17 16:42:36.512770 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:42:36.512710 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeba3d1d6_340f_40f0_a1fe_699dc39bfa1c.slice/crio-33f8393f5debdeb32fc761493a05dc747685c4a1e9b8b1c208cb0dbaebfef92c WatchSource:0}: Error finding container 33f8393f5debdeb32fc761493a05dc747685c4a1e9b8b1c208cb0dbaebfef92c: Status 404 returned error can't find the container with id 33f8393f5debdeb32fc761493a05dc747685c4a1e9b8b1c208cb0dbaebfef92c
Apr 17 16:42:36.514522 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:42:36.514506 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 16:42:37.374496 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:42:37.374453 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-pkbzr" event={"ID":"eba3d1d6-340f-40f0-a1fe-699dc39bfa1c","Type":"ContainerStarted","Data":"33f8393f5debdeb32fc761493a05dc747685c4a1e9b8b1c208cb0dbaebfef92c"}
Apr 17 16:42:41.393294 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:42:41.393258 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-pkbzr" event={"ID":"eba3d1d6-340f-40f0-a1fe-699dc39bfa1c","Type":"ContainerStarted","Data":"7834aa2c63c194f428eb8bf48f05295a5a2d57456d03b29c6dfc2b1809f60f6c"}
Apr 17 16:42:41.410409 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:42:41.410364 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-pkbzr" podStartSLOduration=0.871815485 podStartE2EDuration="5.410350928s" podCreationTimestamp="2026-04-17 16:42:36 +0000 UTC" firstStartedPulling="2026-04-17 16:42:36.514639575 +0000 UTC m=+679.102228523" lastFinishedPulling="2026-04-17 16:42:41.053175004 +0000 UTC m=+683.640763966" observedRunningTime="2026-04-17 16:42:41.409623775 +0000 UTC m=+683.997212748" watchObservedRunningTime="2026-04-17 16:42:41.410350928 +0000 UTC m=+683.997939900"
Apr 17 16:42:44.406353 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:42:44.406322 2572 generic.go:358] "Generic (PLEG): container finished" podID="eba3d1d6-340f-40f0-a1fe-699dc39bfa1c" containerID="7834aa2c63c194f428eb8bf48f05295a5a2d57456d03b29c6dfc2b1809f60f6c" exitCode=0
Apr 17 16:42:44.406699 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:42:44.406394 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-pkbzr" event={"ID":"eba3d1d6-340f-40f0-a1fe-699dc39bfa1c","Type":"ContainerDied","Data":"7834aa2c63c194f428eb8bf48f05295a5a2d57456d03b29c6dfc2b1809f60f6c"}
Apr 17 16:42:45.547860 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:42:45.547834 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-pkbzr"
Apr 17 16:42:45.643160 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:42:45.643133 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x8kmt\" (UniqueName: \"kubernetes.io/projected/eba3d1d6-340f-40f0-a1fe-699dc39bfa1c-kube-api-access-x8kmt\") pod \"eba3d1d6-340f-40f0-a1fe-699dc39bfa1c\" (UID: \"eba3d1d6-340f-40f0-a1fe-699dc39bfa1c\") "
Apr 17 16:42:45.645383 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:42:45.645355 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eba3d1d6-340f-40f0-a1fe-699dc39bfa1c-kube-api-access-x8kmt" (OuterVolumeSpecName: "kube-api-access-x8kmt") pod "eba3d1d6-340f-40f0-a1fe-699dc39bfa1c" (UID: "eba3d1d6-340f-40f0-a1fe-699dc39bfa1c"). InnerVolumeSpecName "kube-api-access-x8kmt". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 16:42:45.744807 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:42:45.744734 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-x8kmt\" (UniqueName: \"kubernetes.io/projected/eba3d1d6-340f-40f0-a1fe-699dc39bfa1c-kube-api-access-x8kmt\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\""
Apr 17 16:42:46.420147 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:42:46.420121 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-pkbzr"
Apr 17 16:42:46.420318 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:42:46.420123 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-pkbzr" event={"ID":"eba3d1d6-340f-40f0-a1fe-699dc39bfa1c","Type":"ContainerDied","Data":"33f8393f5debdeb32fc761493a05dc747685c4a1e9b8b1c208cb0dbaebfef92c"}
Apr 17 16:42:46.420318 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:42:46.420235 2572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="33f8393f5debdeb32fc761493a05dc747685c4a1e9b8b1c208cb0dbaebfef92c"
Apr 17 16:43:21.855523 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:43:21.855435 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-86d7b9f65d-xc6dl"]
Apr 17 16:43:21.856355 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:43:21.855780 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="eba3d1d6-340f-40f0-a1fe-699dc39bfa1c" containerName="s3-init"
Apr 17 16:43:21.856355 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:43:21.855792 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="eba3d1d6-340f-40f0-a1fe-699dc39bfa1c" containerName="s3-init"
Apr 17 16:43:21.856355 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:43:21.855856 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="eba3d1d6-340f-40f0-a1fe-699dc39bfa1c" containerName="s3-init"
Apr 17 16:43:21.890879 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:43:21.890853 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-86d7b9f65d-xc6dl"]
Apr 17 16:43:21.891033 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:43:21.890969 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-86d7b9f65d-xc6dl"
Apr 17 16:43:21.893187 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:43:21.893165 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-configmap-ref-test-kserve-self-signed-certs\""
Apr 17 16:43:21.893309 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:43:21.893197 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-ld29h\""
Apr 17 16:43:21.893865 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:43:21.893846 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 17 16:43:21.894009 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:43:21.893985 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\""
Apr 17 16:43:22.050071 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:43:22.050037 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/08a922f1-09f3-4938-8c5b-4afde116929c-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-86d7b9f65d-xc6dl\" (UID: \"08a922f1-09f3-4938-8c5b-4afde116929c\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-86d7b9f65d-xc6dl"
Apr 17 16:43:22.050240 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:43:22.050085 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/08a922f1-09f3-4938-8c5b-4afde116929c-dshm\") pod \"scheduler-configmap-ref-test-kserve-86d7b9f65d-xc6dl\" (UID: \"08a922f1-09f3-4938-8c5b-4afde116929c\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-86d7b9f65d-xc6dl"
Apr 17 16:43:22.050240 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:43:22.050106 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/08a922f1-09f3-4938-8c5b-4afde116929c-model-cache\") pod \"scheduler-configmap-ref-test-kserve-86d7b9f65d-xc6dl\" (UID: \"08a922f1-09f3-4938-8c5b-4afde116929c\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-86d7b9f65d-xc6dl"
Apr 17 16:43:22.050240 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:43:22.050128 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/08a922f1-09f3-4938-8c5b-4afde116929c-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-86d7b9f65d-xc6dl\" (UID: \"08a922f1-09f3-4938-8c5b-4afde116929c\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-86d7b9f65d-xc6dl"
Apr 17 16:43:22.050240 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:43:22.050165 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmb4c\" (UniqueName: \"kubernetes.io/projected/08a922f1-09f3-4938-8c5b-4afde116929c-kube-api-access-pmb4c\") pod \"scheduler-configmap-ref-test-kserve-86d7b9f65d-xc6dl\" (UID: \"08a922f1-09f3-4938-8c5b-4afde116929c\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-86d7b9f65d-xc6dl"
Apr 17 16:43:22.050240 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:43:22.050193 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/08a922f1-09f3-4938-8c5b-4afde116929c-home\") pod \"scheduler-configmap-ref-test-kserve-86d7b9f65d-xc6dl\" (UID: \"08a922f1-09f3-4938-8c5b-4afde116929c\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-86d7b9f65d-xc6dl"
Apr 17 16:43:22.151500 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:43:22.151423 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/08a922f1-09f3-4938-8c5b-4afde116929c-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-86d7b9f65d-xc6dl\" (UID: \"08a922f1-09f3-4938-8c5b-4afde116929c\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-86d7b9f65d-xc6dl"
Apr 17 16:43:22.151500 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:43:22.151464 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/08a922f1-09f3-4938-8c5b-4afde116929c-dshm\") pod \"scheduler-configmap-ref-test-kserve-86d7b9f65d-xc6dl\" (UID: \"08a922f1-09f3-4938-8c5b-4afde116929c\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-86d7b9f65d-xc6dl"
Apr 17 16:43:22.151500 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:43:22.151490 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/08a922f1-09f3-4938-8c5b-4afde116929c-model-cache\") pod \"scheduler-configmap-ref-test-kserve-86d7b9f65d-xc6dl\" (UID: \"08a922f1-09f3-4938-8c5b-4afde116929c\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-86d7b9f65d-xc6dl"
Apr 17 16:43:22.151706 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:43:22.151517 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/08a922f1-09f3-4938-8c5b-4afde116929c-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-86d7b9f65d-xc6dl\" (UID: \"08a922f1-09f3-4938-8c5b-4afde116929c\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-86d7b9f65d-xc6dl"
Apr 17 16:43:22.151706 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:43:22.151561 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pmb4c\" (UniqueName: \"kubernetes.io/projected/08a922f1-09f3-4938-8c5b-4afde116929c-kube-api-access-pmb4c\") pod \"scheduler-configmap-ref-test-kserve-86d7b9f65d-xc6dl\" (UID: \"08a922f1-09f3-4938-8c5b-4afde116929c\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-86d7b9f65d-xc6dl"
Apr 17 16:43:22.151706 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:43:22.151601 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/08a922f1-09f3-4938-8c5b-4afde116929c-home\") pod \"scheduler-configmap-ref-test-kserve-86d7b9f65d-xc6dl\" (UID: \"08a922f1-09f3-4938-8c5b-4afde116929c\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-86d7b9f65d-xc6dl"
Apr 17 16:43:22.151988 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:43:22.151950 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/08a922f1-09f3-4938-8c5b-4afde116929c-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-86d7b9f65d-xc6dl\" (UID: \"08a922f1-09f3-4938-8c5b-4afde116929c\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-86d7b9f65d-xc6dl"
Apr 17 16:43:22.152056 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:43:22.151987 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/08a922f1-09f3-4938-8c5b-4afde116929c-model-cache\") pod \"scheduler-configmap-ref-test-kserve-86d7b9f65d-xc6dl\" (UID: \"08a922f1-09f3-4938-8c5b-4afde116929c\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-86d7b9f65d-xc6dl"
Apr 17 16:43:22.152056 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:43:22.152003 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/08a922f1-09f3-4938-8c5b-4afde116929c-home\") pod \"scheduler-configmap-ref-test-kserve-86d7b9f65d-xc6dl\" (UID:
\"08a922f1-09f3-4938-8c5b-4afde116929c\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-86d7b9f65d-xc6dl" Apr 17 16:43:22.153883 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:43:22.153867 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/08a922f1-09f3-4938-8c5b-4afde116929c-dshm\") pod \"scheduler-configmap-ref-test-kserve-86d7b9f65d-xc6dl\" (UID: \"08a922f1-09f3-4938-8c5b-4afde116929c\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-86d7b9f65d-xc6dl" Apr 17 16:43:22.154240 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:43:22.154224 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/08a922f1-09f3-4938-8c5b-4afde116929c-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-86d7b9f65d-xc6dl\" (UID: \"08a922f1-09f3-4938-8c5b-4afde116929c\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-86d7b9f65d-xc6dl" Apr 17 16:43:22.166327 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:43:22.166305 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmb4c\" (UniqueName: \"kubernetes.io/projected/08a922f1-09f3-4938-8c5b-4afde116929c-kube-api-access-pmb4c\") pod \"scheduler-configmap-ref-test-kserve-86d7b9f65d-xc6dl\" (UID: \"08a922f1-09f3-4938-8c5b-4afde116929c\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-86d7b9f65d-xc6dl" Apr 17 16:43:22.201373 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:43:22.201341 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-86d7b9f65d-xc6dl" Apr 17 16:43:22.325615 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:43:22.325592 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-86d7b9f65d-xc6dl"] Apr 17 16:43:22.327371 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:43:22.327344 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod08a922f1_09f3_4938_8c5b_4afde116929c.slice/crio-57597f4eb635ff3b90b0da8c044f3b4eeba9b72abdce296e3a84874cf7a63481 WatchSource:0}: Error finding container 57597f4eb635ff3b90b0da8c044f3b4eeba9b72abdce296e3a84874cf7a63481: Status 404 returned error can't find the container with id 57597f4eb635ff3b90b0da8c044f3b4eeba9b72abdce296e3a84874cf7a63481 Apr 17 16:43:22.562818 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:43:22.562731 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-86d7b9f65d-xc6dl" event={"ID":"08a922f1-09f3-4938-8c5b-4afde116929c","Type":"ContainerStarted","Data":"57597f4eb635ff3b90b0da8c044f3b4eeba9b72abdce296e3a84874cf7a63481"} Apr 17 16:43:26.582847 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:43:26.582787 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-86d7b9f65d-xc6dl" event={"ID":"08a922f1-09f3-4938-8c5b-4afde116929c","Type":"ContainerStarted","Data":"92d06a2983f46e7c9bc1858ac5fb5af5ab69650f0ee62f43b21bf166b178f15d"} Apr 17 16:43:29.598497 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:43:29.598465 2572 generic.go:358] "Generic (PLEG): container finished" podID="08a922f1-09f3-4938-8c5b-4afde116929c" containerID="92d06a2983f46e7c9bc1858ac5fb5af5ab69650f0ee62f43b21bf166b178f15d" exitCode=0 Apr 17 16:43:29.598913 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:43:29.598537 2572 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-86d7b9f65d-xc6dl" event={"ID":"08a922f1-09f3-4938-8c5b-4afde116929c","Type":"ContainerDied","Data":"92d06a2983f46e7c9bc1858ac5fb5af5ab69650f0ee62f43b21bf166b178f15d"} Apr 17 16:43:31.607676 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:43:31.607637 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-86d7b9f65d-xc6dl" event={"ID":"08a922f1-09f3-4938-8c5b-4afde116929c","Type":"ContainerStarted","Data":"79297f5cb60b7185a23d37019f3fd6a3b2917833ef58efe35b5f835e425c4487"} Apr 17 16:43:31.628074 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:43:31.628022 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-86d7b9f65d-xc6dl" podStartSLOduration=2.313698432 podStartE2EDuration="10.628009762s" podCreationTimestamp="2026-04-17 16:43:21 +0000 UTC" firstStartedPulling="2026-04-17 16:43:22.329298215 +0000 UTC m=+724.916887163" lastFinishedPulling="2026-04-17 16:43:30.64360953 +0000 UTC m=+733.231198493" observedRunningTime="2026-04-17 16:43:31.6250039 +0000 UTC m=+734.212592871" watchObservedRunningTime="2026-04-17 16:43:31.628009762 +0000 UTC m=+734.215598732" Apr 17 16:43:32.202196 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:43:32.202167 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-86d7b9f65d-xc6dl" Apr 17 16:43:32.202367 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:43:32.202214 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-86d7b9f65d-xc6dl" Apr 17 16:43:32.214641 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:43:32.214618 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-86d7b9f65d-xc6dl" 
Apr 17 16:43:32.623187 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:43:32.623156 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-86d7b9f65d-xc6dl" Apr 17 16:44:13.697518 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:44:13.697485 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-86d7b9f65d-xc6dl"] Apr 17 16:44:13.698147 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:44:13.697793 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-86d7b9f65d-xc6dl" podUID="08a922f1-09f3-4938-8c5b-4afde116929c" containerName="main" containerID="cri-o://79297f5cb60b7185a23d37019f3fd6a3b2917833ef58efe35b5f835e425c4487" gracePeriod=30 Apr 17 16:44:13.938922 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:44:13.938900 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-86d7b9f65d-xc6dl" Apr 17 16:44:14.096654 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:44:14.096627 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/08a922f1-09f3-4938-8c5b-4afde116929c-home\") pod \"08a922f1-09f3-4938-8c5b-4afde116929c\" (UID: \"08a922f1-09f3-4938-8c5b-4afde116929c\") " Apr 17 16:44:14.096846 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:44:14.096680 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/08a922f1-09f3-4938-8c5b-4afde116929c-tls-certs\") pod \"08a922f1-09f3-4938-8c5b-4afde116929c\" (UID: \"08a922f1-09f3-4938-8c5b-4afde116929c\") " Apr 17 16:44:14.096846 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:44:14.096704 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" 
(UniqueName: \"kubernetes.io/empty-dir/08a922f1-09f3-4938-8c5b-4afde116929c-model-cache\") pod \"08a922f1-09f3-4938-8c5b-4afde116929c\" (UID: \"08a922f1-09f3-4938-8c5b-4afde116929c\") " Apr 17 16:44:14.096846 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:44:14.096747 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/08a922f1-09f3-4938-8c5b-4afde116929c-dshm\") pod \"08a922f1-09f3-4938-8c5b-4afde116929c\" (UID: \"08a922f1-09f3-4938-8c5b-4afde116929c\") " Apr 17 16:44:14.096846 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:44:14.096770 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pmb4c\" (UniqueName: \"kubernetes.io/projected/08a922f1-09f3-4938-8c5b-4afde116929c-kube-api-access-pmb4c\") pod \"08a922f1-09f3-4938-8c5b-4afde116929c\" (UID: \"08a922f1-09f3-4938-8c5b-4afde116929c\") " Apr 17 16:44:14.097062 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:44:14.096845 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/08a922f1-09f3-4938-8c5b-4afde116929c-kserve-provision-location\") pod \"08a922f1-09f3-4938-8c5b-4afde116929c\" (UID: \"08a922f1-09f3-4938-8c5b-4afde116929c\") " Apr 17 16:44:14.097062 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:44:14.096979 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08a922f1-09f3-4938-8c5b-4afde116929c-model-cache" (OuterVolumeSpecName: "model-cache") pod "08a922f1-09f3-4938-8c5b-4afde116929c" (UID: "08a922f1-09f3-4938-8c5b-4afde116929c"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:44:14.097062 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:44:14.096995 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08a922f1-09f3-4938-8c5b-4afde116929c-home" (OuterVolumeSpecName: "home") pod "08a922f1-09f3-4938-8c5b-4afde116929c" (UID: "08a922f1-09f3-4938-8c5b-4afde116929c"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:44:14.097219 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:44:14.097171 2572 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/08a922f1-09f3-4938-8c5b-4afde116929c-model-cache\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\"" Apr 17 16:44:14.097219 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:44:14.097191 2572 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/08a922f1-09f3-4938-8c5b-4afde116929c-home\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\"" Apr 17 16:44:14.099178 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:44:14.099139 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08a922f1-09f3-4938-8c5b-4afde116929c-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "08a922f1-09f3-4938-8c5b-4afde116929c" (UID: "08a922f1-09f3-4938-8c5b-4afde116929c"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:44:14.100077 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:44:14.100046 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08a922f1-09f3-4938-8c5b-4afde116929c-kube-api-access-pmb4c" (OuterVolumeSpecName: "kube-api-access-pmb4c") pod "08a922f1-09f3-4938-8c5b-4afde116929c" (UID: "08a922f1-09f3-4938-8c5b-4afde116929c"). InnerVolumeSpecName "kube-api-access-pmb4c". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:44:14.102616 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:44:14.102583 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08a922f1-09f3-4938-8c5b-4afde116929c-dshm" (OuterVolumeSpecName: "dshm") pod "08a922f1-09f3-4938-8c5b-4afde116929c" (UID: "08a922f1-09f3-4938-8c5b-4afde116929c"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:44:14.152373 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:44:14.152340 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08a922f1-09f3-4938-8c5b-4afde116929c-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "08a922f1-09f3-4938-8c5b-4afde116929c" (UID: "08a922f1-09f3-4938-8c5b-4afde116929c"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:44:14.198601 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:44:14.198571 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/08a922f1-09f3-4938-8c5b-4afde116929c-tls-certs\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\"" Apr 17 16:44:14.198601 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:44:14.198597 2572 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/08a922f1-09f3-4938-8c5b-4afde116929c-dshm\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\"" Apr 17 16:44:14.198783 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:44:14.198606 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pmb4c\" (UniqueName: \"kubernetes.io/projected/08a922f1-09f3-4938-8c5b-4afde116929c-kube-api-access-pmb4c\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\"" Apr 17 16:44:14.198783 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:44:14.198615 2572 
reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/08a922f1-09f3-4938-8c5b-4afde116929c-kserve-provision-location\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\"" Apr 17 16:44:14.775123 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:44:14.775089 2572 generic.go:358] "Generic (PLEG): container finished" podID="08a922f1-09f3-4938-8c5b-4afde116929c" containerID="79297f5cb60b7185a23d37019f3fd6a3b2917833ef58efe35b5f835e425c4487" exitCode=0 Apr 17 16:44:14.775582 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:44:14.775177 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-86d7b9f65d-xc6dl" Apr 17 16:44:14.775582 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:44:14.775175 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-86d7b9f65d-xc6dl" event={"ID":"08a922f1-09f3-4938-8c5b-4afde116929c","Type":"ContainerDied","Data":"79297f5cb60b7185a23d37019f3fd6a3b2917833ef58efe35b5f835e425c4487"} Apr 17 16:44:14.775582 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:44:14.775217 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-86d7b9f65d-xc6dl" event={"ID":"08a922f1-09f3-4938-8c5b-4afde116929c","Type":"ContainerDied","Data":"57597f4eb635ff3b90b0da8c044f3b4eeba9b72abdce296e3a84874cf7a63481"} Apr 17 16:44:14.775582 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:44:14.775233 2572 scope.go:117] "RemoveContainer" containerID="79297f5cb60b7185a23d37019f3fd6a3b2917833ef58efe35b5f835e425c4487" Apr 17 16:44:14.784400 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:44:14.784381 2572 scope.go:117] "RemoveContainer" containerID="92d06a2983f46e7c9bc1858ac5fb5af5ab69650f0ee62f43b21bf166b178f15d" Apr 17 16:44:14.798558 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:44:14.798531 2572 kubelet.go:2553] 
"SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-86d7b9f65d-xc6dl"] Apr 17 16:44:14.801691 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:44:14.801586 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-86d7b9f65d-xc6dl"] Apr 17 16:44:14.801755 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:44:14.801738 2572 scope.go:117] "RemoveContainer" containerID="79297f5cb60b7185a23d37019f3fd6a3b2917833ef58efe35b5f835e425c4487" Apr 17 16:44:14.801988 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:44:14.801970 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79297f5cb60b7185a23d37019f3fd6a3b2917833ef58efe35b5f835e425c4487\": container with ID starting with 79297f5cb60b7185a23d37019f3fd6a3b2917833ef58efe35b5f835e425c4487 not found: ID does not exist" containerID="79297f5cb60b7185a23d37019f3fd6a3b2917833ef58efe35b5f835e425c4487" Apr 17 16:44:14.802026 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:44:14.801996 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79297f5cb60b7185a23d37019f3fd6a3b2917833ef58efe35b5f835e425c4487"} err="failed to get container status \"79297f5cb60b7185a23d37019f3fd6a3b2917833ef58efe35b5f835e425c4487\": rpc error: code = NotFound desc = could not find container \"79297f5cb60b7185a23d37019f3fd6a3b2917833ef58efe35b5f835e425c4487\": container with ID starting with 79297f5cb60b7185a23d37019f3fd6a3b2917833ef58efe35b5f835e425c4487 not found: ID does not exist" Apr 17 16:44:14.802026 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:44:14.802014 2572 scope.go:117] "RemoveContainer" containerID="92d06a2983f46e7c9bc1858ac5fb5af5ab69650f0ee62f43b21bf166b178f15d" Apr 17 16:44:14.802225 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:44:14.802209 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound 
desc = could not find container \"92d06a2983f46e7c9bc1858ac5fb5af5ab69650f0ee62f43b21bf166b178f15d\": container with ID starting with 92d06a2983f46e7c9bc1858ac5fb5af5ab69650f0ee62f43b21bf166b178f15d not found: ID does not exist" containerID="92d06a2983f46e7c9bc1858ac5fb5af5ab69650f0ee62f43b21bf166b178f15d" Apr 17 16:44:14.802264 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:44:14.802232 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92d06a2983f46e7c9bc1858ac5fb5af5ab69650f0ee62f43b21bf166b178f15d"} err="failed to get container status \"92d06a2983f46e7c9bc1858ac5fb5af5ab69650f0ee62f43b21bf166b178f15d\": rpc error: code = NotFound desc = could not find container \"92d06a2983f46e7c9bc1858ac5fb5af5ab69650f0ee62f43b21bf166b178f15d\": container with ID starting with 92d06a2983f46e7c9bc1858ac5fb5af5ab69650f0ee62f43b21bf166b178f15d not found: ID does not exist" Apr 17 16:44:15.973608 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:44:15.973571 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08a922f1-09f3-4938-8c5b-4afde116929c" path="/var/lib/kubelet/pods/08a922f1-09f3-4938-8c5b-4afde116929c/volumes" Apr 17 16:44:28.977408 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:44:28.977374 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-74cfd7d874-l5sgd"] Apr 17 16:44:28.977796 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:44:28.977704 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="08a922f1-09f3-4938-8c5b-4afde116929c" containerName="storage-initializer" Apr 17 16:44:28.977796 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:44:28.977729 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="08a922f1-09f3-4938-8c5b-4afde116929c" containerName="storage-initializer" Apr 17 16:44:28.977796 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:44:28.977738 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: 
removing container" podUID="08a922f1-09f3-4938-8c5b-4afde116929c" containerName="main" Apr 17 16:44:28.977796 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:44:28.977744 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="08a922f1-09f3-4938-8c5b-4afde116929c" containerName="main" Apr 17 16:44:28.977934 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:44:28.977815 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="08a922f1-09f3-4938-8c5b-4afde116929c" containerName="main" Apr 17 16:44:28.983009 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:44:28.982990 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-74cfd7d874-l5sgd" Apr 17 16:44:28.985992 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:44:28.985968 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-kserve-self-signed-certs\"" Apr 17 16:44:28.986112 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:44:28.986012 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 17 16:44:28.986112 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:44:28.986023 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-ld29h\"" Apr 17 16:44:28.986112 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:44:28.985974 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 17 16:44:28.992269 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:44:28.992245 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-74cfd7d874-l5sgd"] Apr 17 16:44:29.031602 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:44:29.031575 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-8ptwt\" (UniqueName: \"kubernetes.io/projected/5fe646b3-2c37-4cb3-8359-1188d47a5d9b-kube-api-access-8ptwt\") pod \"scheduler-ha-replicas-test-kserve-74cfd7d874-l5sgd\" (UID: \"5fe646b3-2c37-4cb3-8359-1188d47a5d9b\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-74cfd7d874-l5sgd" Apr 17 16:44:29.031746 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:44:29.031613 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5fe646b3-2c37-4cb3-8359-1188d47a5d9b-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-74cfd7d874-l5sgd\" (UID: \"5fe646b3-2c37-4cb3-8359-1188d47a5d9b\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-74cfd7d874-l5sgd" Apr 17 16:44:29.031746 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:44:29.031637 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5fe646b3-2c37-4cb3-8359-1188d47a5d9b-home\") pod \"scheduler-ha-replicas-test-kserve-74cfd7d874-l5sgd\" (UID: \"5fe646b3-2c37-4cb3-8359-1188d47a5d9b\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-74cfd7d874-l5sgd" Apr 17 16:44:29.031746 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:44:29.031683 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5fe646b3-2c37-4cb3-8359-1188d47a5d9b-model-cache\") pod \"scheduler-ha-replicas-test-kserve-74cfd7d874-l5sgd\" (UID: \"5fe646b3-2c37-4cb3-8359-1188d47a5d9b\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-74cfd7d874-l5sgd" Apr 17 16:44:29.031746 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:44:29.031700 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/5fe646b3-2c37-4cb3-8359-1188d47a5d9b-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-74cfd7d874-l5sgd\" (UID: \"5fe646b3-2c37-4cb3-8359-1188d47a5d9b\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-74cfd7d874-l5sgd" Apr 17 16:44:29.031888 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:44:29.031847 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5fe646b3-2c37-4cb3-8359-1188d47a5d9b-dshm\") pod \"scheduler-ha-replicas-test-kserve-74cfd7d874-l5sgd\" (UID: \"5fe646b3-2c37-4cb3-8359-1188d47a5d9b\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-74cfd7d874-l5sgd" Apr 17 16:44:29.132477 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:44:29.132445 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5fe646b3-2c37-4cb3-8359-1188d47a5d9b-model-cache\") pod \"scheduler-ha-replicas-test-kserve-74cfd7d874-l5sgd\" (UID: \"5fe646b3-2c37-4cb3-8359-1188d47a5d9b\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-74cfd7d874-l5sgd" Apr 17 16:44:29.132639 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:44:29.132482 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5fe646b3-2c37-4cb3-8359-1188d47a5d9b-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-74cfd7d874-l5sgd\" (UID: \"5fe646b3-2c37-4cb3-8359-1188d47a5d9b\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-74cfd7d874-l5sgd" Apr 17 16:44:29.132639 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:44:29.132537 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5fe646b3-2c37-4cb3-8359-1188d47a5d9b-dshm\") pod \"scheduler-ha-replicas-test-kserve-74cfd7d874-l5sgd\" (UID: \"5fe646b3-2c37-4cb3-8359-1188d47a5d9b\") " 
pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-74cfd7d874-l5sgd"
Apr 17 16:44:29.132639 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:44:29.132587 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8ptwt\" (UniqueName: \"kubernetes.io/projected/5fe646b3-2c37-4cb3-8359-1188d47a5d9b-kube-api-access-8ptwt\") pod \"scheduler-ha-replicas-test-kserve-74cfd7d874-l5sgd\" (UID: \"5fe646b3-2c37-4cb3-8359-1188d47a5d9b\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-74cfd7d874-l5sgd"
Apr 17 16:44:29.132639 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:44:29.132625 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5fe646b3-2c37-4cb3-8359-1188d47a5d9b-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-74cfd7d874-l5sgd\" (UID: \"5fe646b3-2c37-4cb3-8359-1188d47a5d9b\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-74cfd7d874-l5sgd"
Apr 17 16:44:29.132882 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:44:29.132660 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5fe646b3-2c37-4cb3-8359-1188d47a5d9b-home\") pod \"scheduler-ha-replicas-test-kserve-74cfd7d874-l5sgd\" (UID: \"5fe646b3-2c37-4cb3-8359-1188d47a5d9b\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-74cfd7d874-l5sgd"
Apr 17 16:44:29.132882 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:44:29.132854 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5fe646b3-2c37-4cb3-8359-1188d47a5d9b-model-cache\") pod \"scheduler-ha-replicas-test-kserve-74cfd7d874-l5sgd\" (UID: \"5fe646b3-2c37-4cb3-8359-1188d47a5d9b\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-74cfd7d874-l5sgd"
Apr 17 16:44:29.133075 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:44:29.133054 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5fe646b3-2c37-4cb3-8359-1188d47a5d9b-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-74cfd7d874-l5sgd\" (UID: \"5fe646b3-2c37-4cb3-8359-1188d47a5d9b\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-74cfd7d874-l5sgd"
Apr 17 16:44:29.133134 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:44:29.133075 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5fe646b3-2c37-4cb3-8359-1188d47a5d9b-home\") pod \"scheduler-ha-replicas-test-kserve-74cfd7d874-l5sgd\" (UID: \"5fe646b3-2c37-4cb3-8359-1188d47a5d9b\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-74cfd7d874-l5sgd"
Apr 17 16:44:29.134923 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:44:29.134902 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5fe646b3-2c37-4cb3-8359-1188d47a5d9b-dshm\") pod \"scheduler-ha-replicas-test-kserve-74cfd7d874-l5sgd\" (UID: \"5fe646b3-2c37-4cb3-8359-1188d47a5d9b\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-74cfd7d874-l5sgd"
Apr 17 16:44:29.135235 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:44:29.135220 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5fe646b3-2c37-4cb3-8359-1188d47a5d9b-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-74cfd7d874-l5sgd\" (UID: \"5fe646b3-2c37-4cb3-8359-1188d47a5d9b\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-74cfd7d874-l5sgd"
Apr 17 16:44:29.140488 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:44:29.140470 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ptwt\" (UniqueName: \"kubernetes.io/projected/5fe646b3-2c37-4cb3-8359-1188d47a5d9b-kube-api-access-8ptwt\") pod \"scheduler-ha-replicas-test-kserve-74cfd7d874-l5sgd\" (UID: \"5fe646b3-2c37-4cb3-8359-1188d47a5d9b\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-74cfd7d874-l5sgd"
Apr 17 16:44:29.293614 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:44:29.293544 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-74cfd7d874-l5sgd"
Apr 17 16:44:29.414894 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:44:29.414864 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-74cfd7d874-l5sgd"]
Apr 17 16:44:29.418061 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:44:29.418032 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe646b3_2c37_4cb3_8359_1188d47a5d9b.slice/crio-f773b81cc032dc6b1cd269774cca18e45fd0e79f3396072f87a700fb2f5b71e4 WatchSource:0}: Error finding container f773b81cc032dc6b1cd269774cca18e45fd0e79f3396072f87a700fb2f5b71e4: Status 404 returned error can't find the container with id f773b81cc032dc6b1cd269774cca18e45fd0e79f3396072f87a700fb2f5b71e4
Apr 17 16:44:29.839063 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:44:29.839017 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-74cfd7d874-l5sgd" event={"ID":"5fe646b3-2c37-4cb3-8359-1188d47a5d9b","Type":"ContainerStarted","Data":"ff57b86613dd8b3397ee5cbd762e9a523647ad380a28821ec05969266e10a1c8"}
Apr 17 16:44:29.839063 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:44:29.839060 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-74cfd7d874-l5sgd" event={"ID":"5fe646b3-2c37-4cb3-8359-1188d47a5d9b","Type":"ContainerStarted","Data":"f773b81cc032dc6b1cd269774cca18e45fd0e79f3396072f87a700fb2f5b71e4"}
Apr 17 16:44:33.857859 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:44:33.857826 2572 generic.go:358] "Generic (PLEG): container finished" podID="5fe646b3-2c37-4cb3-8359-1188d47a5d9b" containerID="ff57b86613dd8b3397ee5cbd762e9a523647ad380a28821ec05969266e10a1c8" exitCode=0
Apr 17 16:44:33.858220 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:44:33.857871 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-74cfd7d874-l5sgd" event={"ID":"5fe646b3-2c37-4cb3-8359-1188d47a5d9b","Type":"ContainerDied","Data":"ff57b86613dd8b3397ee5cbd762e9a523647ad380a28821ec05969266e10a1c8"}
Apr 17 16:44:34.863669 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:44:34.863585 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-74cfd7d874-l5sgd" event={"ID":"5fe646b3-2c37-4cb3-8359-1188d47a5d9b","Type":"ContainerStarted","Data":"bc5e5ed32d9a225ca2897087cf0638c747bcf08cbdecf71bb826bd74c1e1e939"}
Apr 17 16:44:34.881766 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:44:34.881708 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-74cfd7d874-l5sgd" podStartSLOduration=6.881695385 podStartE2EDuration="6.881695385s" podCreationTimestamp="2026-04-17 16:44:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:44:34.880613257 +0000 UTC m=+797.468202227" watchObservedRunningTime="2026-04-17 16:44:34.881695385 +0000 UTC m=+797.469284404"
Apr 17 16:44:39.294331 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:44:39.294299 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-74cfd7d874-l5sgd"
Apr 17 16:44:39.294331 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:44:39.294341 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-74cfd7d874-l5sgd"
Apr 17 16:44:39.306912 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:44:39.306884 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-74cfd7d874-l5sgd"
Apr 17 16:44:39.892762 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:44:39.892734 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-74cfd7d874-l5sgd"
Apr 17 16:44:47.678050 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:44:47.678018 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c67d7cdfft55zj"]
Apr 17 16:44:47.711777 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:44:47.711740 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c67d7cdfft55zj"]
Apr 17 16:44:47.711946 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:44:47.711867 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c67d7cdfft55zj"
Apr 17 16:44:47.714617 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:44:47.714596 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvdde380eaa9fe1facad32d45131f9e34d-kserve-self-signed-certs\""
Apr 17 16:44:47.796946 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:44:47.796922 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d6b218a1-63ce-4c38-9d4a-b136556f7052-home\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c67d7cdfft55zj\" (UID: \"d6b218a1-63ce-4c38-9d4a-b136556f7052\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c67d7cdfft55zj"
Apr 17 16:44:47.797114 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:44:47.796958 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d6b218a1-63ce-4c38-9d4a-b136556f7052-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c67d7cdfft55zj\" (UID: \"d6b218a1-63ce-4c38-9d4a-b136556f7052\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c67d7cdfft55zj"
Apr 17 16:44:47.797114 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:44:47.797027 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnpkj\" (UniqueName: \"kubernetes.io/projected/d6b218a1-63ce-4c38-9d4a-b136556f7052-kube-api-access-nnpkj\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c67d7cdfft55zj\" (UID: \"d6b218a1-63ce-4c38-9d4a-b136556f7052\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c67d7cdfft55zj"
Apr 17 16:44:47.797114 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:44:47.797089 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d6b218a1-63ce-4c38-9d4a-b136556f7052-dshm\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c67d7cdfft55zj\" (UID: \"d6b218a1-63ce-4c38-9d4a-b136556f7052\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c67d7cdfft55zj"
Apr 17 16:44:47.797253 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:44:47.797112 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d6b218a1-63ce-4c38-9d4a-b136556f7052-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c67d7cdfft55zj\" (UID: \"d6b218a1-63ce-4c38-9d4a-b136556f7052\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c67d7cdfft55zj"
Apr 17 16:44:47.797253 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:44:47.797142 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d6b218a1-63ce-4c38-9d4a-b136556f7052-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c67d7cdfft55zj\" (UID: \"d6b218a1-63ce-4c38-9d4a-b136556f7052\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c67d7cdfft55zj"
Apr 17 16:44:47.897739 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:44:47.897685 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d6b218a1-63ce-4c38-9d4a-b136556f7052-home\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c67d7cdfft55zj\" (UID: \"d6b218a1-63ce-4c38-9d4a-b136556f7052\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c67d7cdfft55zj"
Apr 17 16:44:47.897739 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:44:47.897741 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d6b218a1-63ce-4c38-9d4a-b136556f7052-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c67d7cdfft55zj\" (UID: \"d6b218a1-63ce-4c38-9d4a-b136556f7052\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c67d7cdfft55zj"
Apr 17 16:44:47.897925 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:44:47.897771 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nnpkj\" (UniqueName: \"kubernetes.io/projected/d6b218a1-63ce-4c38-9d4a-b136556f7052-kube-api-access-nnpkj\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c67d7cdfft55zj\" (UID: \"d6b218a1-63ce-4c38-9d4a-b136556f7052\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c67d7cdfft55zj"
Apr 17 16:44:47.897925 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:44:47.897799 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d6b218a1-63ce-4c38-9d4a-b136556f7052-dshm\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c67d7cdfft55zj\" (UID: \"d6b218a1-63ce-4c38-9d4a-b136556f7052\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c67d7cdfft55zj"
Apr 17 16:44:47.897925 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:44:47.897814 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d6b218a1-63ce-4c38-9d4a-b136556f7052-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c67d7cdfft55zj\" (UID: \"d6b218a1-63ce-4c38-9d4a-b136556f7052\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c67d7cdfft55zj"
Apr 17 16:44:47.897925 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:44:47.897833 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d6b218a1-63ce-4c38-9d4a-b136556f7052-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c67d7cdfft55zj\" (UID: \"d6b218a1-63ce-4c38-9d4a-b136556f7052\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c67d7cdfft55zj"
Apr 17 16:44:47.898155 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:44:47.898136 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d6b218a1-63ce-4c38-9d4a-b136556f7052-home\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c67d7cdfft55zj\" (UID: \"d6b218a1-63ce-4c38-9d4a-b136556f7052\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c67d7cdfft55zj"
Apr 17 16:44:47.898218 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:44:47.898154 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d6b218a1-63ce-4c38-9d4a-b136556f7052-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c67d7cdfft55zj\" (UID: \"d6b218a1-63ce-4c38-9d4a-b136556f7052\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c67d7cdfft55zj"
Apr 17 16:44:47.898277 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:44:47.898217 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d6b218a1-63ce-4c38-9d4a-b136556f7052-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c67d7cdfft55zj\" (UID: \"d6b218a1-63ce-4c38-9d4a-b136556f7052\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c67d7cdfft55zj"
Apr 17 16:44:47.900138 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:44:47.900109 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d6b218a1-63ce-4c38-9d4a-b136556f7052-dshm\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c67d7cdfft55zj\" (UID: \"d6b218a1-63ce-4c38-9d4a-b136556f7052\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c67d7cdfft55zj"
Apr 17 16:44:47.900541 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:44:47.900520 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d6b218a1-63ce-4c38-9d4a-b136556f7052-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c67d7cdfft55zj\" (UID: \"d6b218a1-63ce-4c38-9d4a-b136556f7052\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c67d7cdfft55zj"
Apr 17 16:44:47.905453 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:44:47.905432 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnpkj\" (UniqueName: \"kubernetes.io/projected/d6b218a1-63ce-4c38-9d4a-b136556f7052-kube-api-access-nnpkj\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c67d7cdfft55zj\" (UID: \"d6b218a1-63ce-4c38-9d4a-b136556f7052\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c67d7cdfft55zj"
Apr 17 16:44:48.022891 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:44:48.022804 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c67d7cdfft55zj"
Apr 17 16:44:48.160523 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:44:48.160499 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c67d7cdfft55zj"]
Apr 17 16:44:48.162150 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:44:48.162127 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6b218a1_63ce_4c38_9d4a_b136556f7052.slice/crio-3f20520f2af1daf573aaebb84d943292b1e30164f6317a10c0d156107c72afd5 WatchSource:0}: Error finding container 3f20520f2af1daf573aaebb84d943292b1e30164f6317a10c0d156107c72afd5: Status 404 returned error can't find the container with id 3f20520f2af1daf573aaebb84d943292b1e30164f6317a10c0d156107c72afd5
Apr 17 16:44:48.916688 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:44:48.916648 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c67d7cdfft55zj" event={"ID":"d6b218a1-63ce-4c38-9d4a-b136556f7052","Type":"ContainerStarted","Data":"e350a50d9ffe1b9f41b3e443d5fbe06418680a45fe7b9709b43cc48cae84177b"}
Apr 17 16:44:48.917073 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:44:48.916697 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c67d7cdfft55zj" event={"ID":"d6b218a1-63ce-4c38-9d4a-b136556f7052","Type":"ContainerStarted","Data":"3f20520f2af1daf573aaebb84d943292b1e30164f6317a10c0d156107c72afd5"}
Apr 17 16:44:52.932883 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:44:52.932846 2572 generic.go:358] "Generic (PLEG): container finished" podID="d6b218a1-63ce-4c38-9d4a-b136556f7052" containerID="e350a50d9ffe1b9f41b3e443d5fbe06418680a45fe7b9709b43cc48cae84177b" exitCode=0
Apr 17 16:44:52.933288 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:44:52.932925 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c67d7cdfft55zj" event={"ID":"d6b218a1-63ce-4c38-9d4a-b136556f7052","Type":"ContainerDied","Data":"e350a50d9ffe1b9f41b3e443d5fbe06418680a45fe7b9709b43cc48cae84177b"}
Apr 17 16:45:03.277450 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:03.277413 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-74cfd7d874-l5sgd"]
Apr 17 16:45:03.277882 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:03.277830 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-74cfd7d874-l5sgd" podUID="5fe646b3-2c37-4cb3-8359-1188d47a5d9b" containerName="main" containerID="cri-o://bc5e5ed32d9a225ca2897087cf0638c747bcf08cbdecf71bb826bd74c1e1e939" gracePeriod=30
Apr 17 16:45:03.591211 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:03.591185 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-74cfd7d874-l5sgd"
Apr 17 16:45:03.751382 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:03.751337 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5fe646b3-2c37-4cb3-8359-1188d47a5d9b-tls-certs\") pod \"5fe646b3-2c37-4cb3-8359-1188d47a5d9b\" (UID: \"5fe646b3-2c37-4cb3-8359-1188d47a5d9b\") "
Apr 17 16:45:03.751570 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:03.751400 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8ptwt\" (UniqueName: \"kubernetes.io/projected/5fe646b3-2c37-4cb3-8359-1188d47a5d9b-kube-api-access-8ptwt\") pod \"5fe646b3-2c37-4cb3-8359-1188d47a5d9b\" (UID: \"5fe646b3-2c37-4cb3-8359-1188d47a5d9b\") "
Apr 17 16:45:03.751570 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:03.751426 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5fe646b3-2c37-4cb3-8359-1188d47a5d9b-home\") pod \"5fe646b3-2c37-4cb3-8359-1188d47a5d9b\" (UID: \"5fe646b3-2c37-4cb3-8359-1188d47a5d9b\") "
Apr 17 16:45:03.751570 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:03.751449 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5fe646b3-2c37-4cb3-8359-1188d47a5d9b-kserve-provision-location\") pod \"5fe646b3-2c37-4cb3-8359-1188d47a5d9b\" (UID: \"5fe646b3-2c37-4cb3-8359-1188d47a5d9b\") "
Apr 17 16:45:03.751769 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:03.751655 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5fe646b3-2c37-4cb3-8359-1188d47a5d9b-model-cache\") pod \"5fe646b3-2c37-4cb3-8359-1188d47a5d9b\" (UID: \"5fe646b3-2c37-4cb3-8359-1188d47a5d9b\") "
Apr 17 16:45:03.751769 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:03.751689 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5fe646b3-2c37-4cb3-8359-1188d47a5d9b-dshm\") pod \"5fe646b3-2c37-4cb3-8359-1188d47a5d9b\" (UID: \"5fe646b3-2c37-4cb3-8359-1188d47a5d9b\") "
Apr 17 16:45:03.751769 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:03.751752 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5fe646b3-2c37-4cb3-8359-1188d47a5d9b-home" (OuterVolumeSpecName: "home") pod "5fe646b3-2c37-4cb3-8359-1188d47a5d9b" (UID: "5fe646b3-2c37-4cb3-8359-1188d47a5d9b"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 16:45:03.751939 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:03.751911 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5fe646b3-2c37-4cb3-8359-1188d47a5d9b-model-cache" (OuterVolumeSpecName: "model-cache") pod "5fe646b3-2c37-4cb3-8359-1188d47a5d9b" (UID: "5fe646b3-2c37-4cb3-8359-1188d47a5d9b"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 16:45:03.752132 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:03.752026 2572 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5fe646b3-2c37-4cb3-8359-1188d47a5d9b-home\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\""
Apr 17 16:45:03.752132 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:03.752050 2572 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5fe646b3-2c37-4cb3-8359-1188d47a5d9b-model-cache\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\""
Apr 17 16:45:03.753753 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:03.753700 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe646b3-2c37-4cb3-8359-1188d47a5d9b-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "5fe646b3-2c37-4cb3-8359-1188d47a5d9b" (UID: "5fe646b3-2c37-4cb3-8359-1188d47a5d9b"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 16:45:03.753847 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:03.753790 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe646b3-2c37-4cb3-8359-1188d47a5d9b-kube-api-access-8ptwt" (OuterVolumeSpecName: "kube-api-access-8ptwt") pod "5fe646b3-2c37-4cb3-8359-1188d47a5d9b" (UID: "5fe646b3-2c37-4cb3-8359-1188d47a5d9b"). InnerVolumeSpecName "kube-api-access-8ptwt". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 16:45:03.753847 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:03.753790 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5fe646b3-2c37-4cb3-8359-1188d47a5d9b-dshm" (OuterVolumeSpecName: "dshm") pod "5fe646b3-2c37-4cb3-8359-1188d47a5d9b" (UID: "5fe646b3-2c37-4cb3-8359-1188d47a5d9b"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 16:45:03.811979 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:03.811937 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5fe646b3-2c37-4cb3-8359-1188d47a5d9b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "5fe646b3-2c37-4cb3-8359-1188d47a5d9b" (UID: "5fe646b3-2c37-4cb3-8359-1188d47a5d9b"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 16:45:03.852472 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:03.852447 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5fe646b3-2c37-4cb3-8359-1188d47a5d9b-tls-certs\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\""
Apr 17 16:45:03.852472 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:03.852473 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8ptwt\" (UniqueName: \"kubernetes.io/projected/5fe646b3-2c37-4cb3-8359-1188d47a5d9b-kube-api-access-8ptwt\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\""
Apr 17 16:45:03.852655 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:03.852485 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5fe646b3-2c37-4cb3-8359-1188d47a5d9b-kserve-provision-location\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\""
Apr 17 16:45:03.852655 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:03.852493 2572 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5fe646b3-2c37-4cb3-8359-1188d47a5d9b-dshm\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\""
Apr 17 16:45:03.991710 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:03.991683 2572 generic.go:358] "Generic (PLEG): container finished" podID="5fe646b3-2c37-4cb3-8359-1188d47a5d9b" containerID="bc5e5ed32d9a225ca2897087cf0638c747bcf08cbdecf71bb826bd74c1e1e939" exitCode=0
Apr 17 16:45:03.991861 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:03.991766 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-74cfd7d874-l5sgd"
Apr 17 16:45:03.991861 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:03.991803 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-74cfd7d874-l5sgd" event={"ID":"5fe646b3-2c37-4cb3-8359-1188d47a5d9b","Type":"ContainerDied","Data":"bc5e5ed32d9a225ca2897087cf0638c747bcf08cbdecf71bb826bd74c1e1e939"}
Apr 17 16:45:03.991861 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:03.991840 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-74cfd7d874-l5sgd" event={"ID":"5fe646b3-2c37-4cb3-8359-1188d47a5d9b","Type":"ContainerDied","Data":"f773b81cc032dc6b1cd269774cca18e45fd0e79f3396072f87a700fb2f5b71e4"}
Apr 17 16:45:03.991861 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:03.991855 2572 scope.go:117] "RemoveContainer" containerID="bc5e5ed32d9a225ca2897087cf0638c747bcf08cbdecf71bb826bd74c1e1e939"
Apr 17 16:45:04.000695 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:04.000675 2572 scope.go:117] "RemoveContainer" containerID="ff57b86613dd8b3397ee5cbd762e9a523647ad380a28821ec05969266e10a1c8"
Apr 17 16:45:04.010868 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:04.010847 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-74cfd7d874-l5sgd"]
Apr 17 16:45:04.014204 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:04.014183 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-74cfd7d874-l5sgd"]
Apr 17 16:45:04.062472 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:04.062450 2572 scope.go:117] "RemoveContainer" containerID="bc5e5ed32d9a225ca2897087cf0638c747bcf08cbdecf71bb826bd74c1e1e939"
Apr 17 16:45:04.062836 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:45:04.062807 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc5e5ed32d9a225ca2897087cf0638c747bcf08cbdecf71bb826bd74c1e1e939\": container with ID starting with bc5e5ed32d9a225ca2897087cf0638c747bcf08cbdecf71bb826bd74c1e1e939 not found: ID does not exist" containerID="bc5e5ed32d9a225ca2897087cf0638c747bcf08cbdecf71bb826bd74c1e1e939"
Apr 17 16:45:04.062929 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:04.062836 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc5e5ed32d9a225ca2897087cf0638c747bcf08cbdecf71bb826bd74c1e1e939"} err="failed to get container status \"bc5e5ed32d9a225ca2897087cf0638c747bcf08cbdecf71bb826bd74c1e1e939\": rpc error: code = NotFound desc = could not find container \"bc5e5ed32d9a225ca2897087cf0638c747bcf08cbdecf71bb826bd74c1e1e939\": container with ID starting with bc5e5ed32d9a225ca2897087cf0638c747bcf08cbdecf71bb826bd74c1e1e939 not found: ID does not exist"
Apr 17 16:45:04.062929 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:04.062855 2572 scope.go:117] "RemoveContainer" containerID="ff57b86613dd8b3397ee5cbd762e9a523647ad380a28821ec05969266e10a1c8"
Apr 17 16:45:04.063222 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:45:04.063189 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff57b86613dd8b3397ee5cbd762e9a523647ad380a28821ec05969266e10a1c8\": container with ID starting with ff57b86613dd8b3397ee5cbd762e9a523647ad380a28821ec05969266e10a1c8 not found: ID does not exist" containerID="ff57b86613dd8b3397ee5cbd762e9a523647ad380a28821ec05969266e10a1c8"
Apr 17 16:45:04.063324 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:04.063220 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff57b86613dd8b3397ee5cbd762e9a523647ad380a28821ec05969266e10a1c8"} err="failed to get container status \"ff57b86613dd8b3397ee5cbd762e9a523647ad380a28821ec05969266e10a1c8\": rpc error: code = NotFound desc = could not find container \"ff57b86613dd8b3397ee5cbd762e9a523647ad380a28821ec05969266e10a1c8\": container with ID starting with ff57b86613dd8b3397ee5cbd762e9a523647ad380a28821ec05969266e10a1c8 not found: ID does not exist"
Apr 17 16:45:05.974375 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:05.974343 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe646b3-2c37-4cb3-8359-1188d47a5d9b" path="/var/lib/kubelet/pods/5fe646b3-2c37-4cb3-8359-1188d47a5d9b/volumes"
Apr 17 16:45:09.566474 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:09.566428 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7466444c7d-kt6p6"]
Apr 17 16:45:09.567073 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:09.567047 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5fe646b3-2c37-4cb3-8359-1188d47a5d9b" containerName="main"
Apr 17 16:45:09.567073 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:09.567064 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fe646b3-2c37-4cb3-8359-1188d47a5d9b" containerName="main"
Apr 17 16:45:09.567182 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:09.567094 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5fe646b3-2c37-4cb3-8359-1188d47a5d9b" containerName="storage-initializer"
Apr 17 16:45:09.567182 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:09.567104 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fe646b3-2c37-4cb3-8359-1188d47a5d9b" containerName="storage-initializer"
Apr 17 16:45:09.567286 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:09.567198 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="5fe646b3-2c37-4cb3-8359-1188d47a5d9b" containerName="main"
Apr 17 16:45:09.602059 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:09.601882 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7466444c7d-kt6p6"]
Apr 17 16:45:09.602059 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:09.602028 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7466444c7d-kt6p6"
Apr 17 16:45:09.604622 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:09.604595 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"precise-prefix-cache-test-kserve-self-signed-certs\""
Apr 17 16:45:09.607188 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:09.606788 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8a46c102-80ee-41cf-8a52-8cb2a81ff5fc-home\") pod \"precise-prefix-cache-test-kserve-7466444c7d-kt6p6\" (UID: \"8a46c102-80ee-41cf-8a52-8cb2a81ff5fc\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7466444c7d-kt6p6"
Apr 17 16:45:09.607188 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:09.606826 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8a46c102-80ee-41cf-8a52-8cb2a81ff5fc-model-cache\") pod \"precise-prefix-cache-test-kserve-7466444c7d-kt6p6\" (UID: \"8a46c102-80ee-41cf-8a52-8cb2a81ff5fc\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7466444c7d-kt6p6"
Apr 17 16:45:09.607188 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:09.606868 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8a46c102-80ee-41cf-8a52-8cb2a81ff5fc-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-7466444c7d-kt6p6\" (UID: \"8a46c102-80ee-41cf-8a52-8cb2a81ff5fc\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7466444c7d-kt6p6"
Apr 17 16:45:09.607188 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:09.606928 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8a46c102-80ee-41cf-8a52-8cb2a81ff5fc-dshm\") pod \"precise-prefix-cache-test-kserve-7466444c7d-kt6p6\" (UID: \"8a46c102-80ee-41cf-8a52-8cb2a81ff5fc\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7466444c7d-kt6p6"
Apr 17 16:45:09.607188 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:09.606952 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8a46c102-80ee-41cf-8a52-8cb2a81ff5fc-tls-certs\") pod \"precise-prefix-cache-test-kserve-7466444c7d-kt6p6\" (UID: \"8a46c102-80ee-41cf-8a52-8cb2a81ff5fc\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7466444c7d-kt6p6"
Apr 17 16:45:09.607188 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:09.607015 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lv74\" (UniqueName: \"kubernetes.io/projected/8a46c102-80ee-41cf-8a52-8cb2a81ff5fc-kube-api-access-5lv74\") pod \"precise-prefix-cache-test-kserve-7466444c7d-kt6p6\" (UID: \"8a46c102-80ee-41cf-8a52-8cb2a81ff5fc\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7466444c7d-kt6p6"
Apr 17 16:45:09.708544 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:09.708506 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8a46c102-80ee-41cf-8a52-8cb2a81ff5fc-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-7466444c7d-kt6p6\" (UID: \"8a46c102-80ee-41cf-8a52-8cb2a81ff5fc\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7466444c7d-kt6p6"
Apr 17 16:45:09.708713 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:09.708586 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8a46c102-80ee-41cf-8a52-8cb2a81ff5fc-dshm\") pod \"precise-prefix-cache-test-kserve-7466444c7d-kt6p6\" (UID: \"8a46c102-80ee-41cf-8a52-8cb2a81ff5fc\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7466444c7d-kt6p6"
Apr 17 16:45:09.708713 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:09.708615 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8a46c102-80ee-41cf-8a52-8cb2a81ff5fc-tls-certs\") pod \"precise-prefix-cache-test-kserve-7466444c7d-kt6p6\" (UID: \"8a46c102-80ee-41cf-8a52-8cb2a81ff5fc\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7466444c7d-kt6p6"
Apr 17 16:45:09.708860 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:09.708770 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5lv74\" (UniqueName: \"kubernetes.io/projected/8a46c102-80ee-41cf-8a52-8cb2a81ff5fc-kube-api-access-5lv74\") pod \"precise-prefix-cache-test-kserve-7466444c7d-kt6p6\" (UID: \"8a46c102-80ee-41cf-8a52-8cb2a81ff5fc\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7466444c7d-kt6p6"
Apr 17 16:45:09.708903 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:09.708870 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8a46c102-80ee-41cf-8a52-8cb2a81ff5fc-home\") pod \"precise-prefix-cache-test-kserve-7466444c7d-kt6p6\" (UID: \"8a46c102-80ee-41cf-8a52-8cb2a81ff5fc\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7466444c7d-kt6p6"
Apr 17 16:45:09.708946 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:09.708900
2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8a46c102-80ee-41cf-8a52-8cb2a81ff5fc-model-cache\") pod \"precise-prefix-cache-test-kserve-7466444c7d-kt6p6\" (UID: \"8a46c102-80ee-41cf-8a52-8cb2a81ff5fc\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7466444c7d-kt6p6" Apr 17 16:45:09.708946 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:09.708920 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8a46c102-80ee-41cf-8a52-8cb2a81ff5fc-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-7466444c7d-kt6p6\" (UID: \"8a46c102-80ee-41cf-8a52-8cb2a81ff5fc\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7466444c7d-kt6p6" Apr 17 16:45:09.709183 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:09.709156 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8a46c102-80ee-41cf-8a52-8cb2a81ff5fc-home\") pod \"precise-prefix-cache-test-kserve-7466444c7d-kt6p6\" (UID: \"8a46c102-80ee-41cf-8a52-8cb2a81ff5fc\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7466444c7d-kt6p6" Apr 17 16:45:09.709183 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:09.709175 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8a46c102-80ee-41cf-8a52-8cb2a81ff5fc-model-cache\") pod \"precise-prefix-cache-test-kserve-7466444c7d-kt6p6\" (UID: \"8a46c102-80ee-41cf-8a52-8cb2a81ff5fc\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7466444c7d-kt6p6" Apr 17 16:45:09.711199 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:09.711171 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8a46c102-80ee-41cf-8a52-8cb2a81ff5fc-dshm\") pod 
\"precise-prefix-cache-test-kserve-7466444c7d-kt6p6\" (UID: \"8a46c102-80ee-41cf-8a52-8cb2a81ff5fc\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7466444c7d-kt6p6" Apr 17 16:45:09.711596 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:09.711571 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8a46c102-80ee-41cf-8a52-8cb2a81ff5fc-tls-certs\") pod \"precise-prefix-cache-test-kserve-7466444c7d-kt6p6\" (UID: \"8a46c102-80ee-41cf-8a52-8cb2a81ff5fc\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7466444c7d-kt6p6" Apr 17 16:45:09.716766 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:09.716703 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lv74\" (UniqueName: \"kubernetes.io/projected/8a46c102-80ee-41cf-8a52-8cb2a81ff5fc-kube-api-access-5lv74\") pod \"precise-prefix-cache-test-kserve-7466444c7d-kt6p6\" (UID: \"8a46c102-80ee-41cf-8a52-8cb2a81ff5fc\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7466444c7d-kt6p6" Apr 17 16:45:09.915554 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:09.915523 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7466444c7d-kt6p6" Apr 17 16:45:12.373871 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:12.373739 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7466444c7d-kt6p6"] Apr 17 16:45:12.376785 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:45:12.376747 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a46c102_80ee_41cf_8a52_8cb2a81ff5fc.slice/crio-3d5e2058d9f500423576e71a1aca3005074a3af6cab9bc9d72ddd3e3ceda45ba WatchSource:0}: Error finding container 3d5e2058d9f500423576e71a1aca3005074a3af6cab9bc9d72ddd3e3ceda45ba: Status 404 returned error can't find the container with id 3d5e2058d9f500423576e71a1aca3005074a3af6cab9bc9d72ddd3e3ceda45ba Apr 17 16:45:13.036042 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:13.035982 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7466444c7d-kt6p6" event={"ID":"8a46c102-80ee-41cf-8a52-8cb2a81ff5fc","Type":"ContainerStarted","Data":"493ed7d84909d46bd88424828f2f842bb82573107c6f9a3b66f15992b97967d0"} Apr 17 16:45:13.036042 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:13.036046 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7466444c7d-kt6p6" event={"ID":"8a46c102-80ee-41cf-8a52-8cb2a81ff5fc","Type":"ContainerStarted","Data":"3d5e2058d9f500423576e71a1aca3005074a3af6cab9bc9d72ddd3e3ceda45ba"} Apr 17 16:45:17.053923 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:17.053890 2572 generic.go:358] "Generic (PLEG): container finished" podID="8a46c102-80ee-41cf-8a52-8cb2a81ff5fc" containerID="493ed7d84909d46bd88424828f2f842bb82573107c6f9a3b66f15992b97967d0" exitCode=0 Apr 17 16:45:17.054347 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:17.053975 2572 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7466444c7d-kt6p6" event={"ID":"8a46c102-80ee-41cf-8a52-8cb2a81ff5fc","Type":"ContainerDied","Data":"493ed7d84909d46bd88424828f2f842bb82573107c6f9a3b66f15992b97967d0"} Apr 17 16:45:18.059969 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:18.059932 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7466444c7d-kt6p6" event={"ID":"8a46c102-80ee-41cf-8a52-8cb2a81ff5fc","Type":"ContainerStarted","Data":"5118d01a61f817f79445ed4fb9b566153d8f1a9776ac0fb0719404b808ae6c43"} Apr 17 16:45:18.082711 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:18.082645 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7466444c7d-kt6p6" podStartSLOduration=9.082625336 podStartE2EDuration="9.082625336s" podCreationTimestamp="2026-04-17 16:45:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:45:18.079815298 +0000 UTC m=+840.667404267" watchObservedRunningTime="2026-04-17 16:45:18.082625336 +0000 UTC m=+840.670214307" Apr 17 16:45:19.916035 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:19.915981 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7466444c7d-kt6p6" Apr 17 16:45:19.916575 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:19.916083 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7466444c7d-kt6p6" Apr 17 16:45:19.931399 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:19.931377 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7466444c7d-kt6p6" Apr 17 16:45:20.079074 ip-10-0-131-177 kubenswrapper[2572]: I0417 
16:45:20.079017 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7466444c7d-kt6p6" Apr 17 16:45:37.144440 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:37.144405 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c67d7cdfft55zj" event={"ID":"d6b218a1-63ce-4c38-9d4a-b136556f7052","Type":"ContainerStarted","Data":"cd753c0df81593b022fc1dc0a7c175ddb5ec5024cf0239b551b90a8501bc4586"} Apr 17 16:45:37.165135 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:37.165092 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c67d7cdfft55zj" podStartSLOduration=6.805615339 podStartE2EDuration="50.165079413s" podCreationTimestamp="2026-04-17 16:44:47 +0000 UTC" firstStartedPulling="2026-04-17 16:44:52.93406466 +0000 UTC m=+815.521653608" lastFinishedPulling="2026-04-17 16:45:36.293528734 +0000 UTC m=+858.881117682" observedRunningTime="2026-04-17 16:45:37.163239689 +0000 UTC m=+859.750828668" watchObservedRunningTime="2026-04-17 16:45:37.165079413 +0000 UTC m=+859.752668382" Apr 17 16:45:38.023205 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:38.023174 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c67d7cdfft55zj" Apr 17 16:45:38.023205 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:38.023210 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c67d7cdfft55zj" Apr 17 16:45:38.024692 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:38.024665 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c67d7cdfft55zj" podUID="d6b218a1-63ce-4c38-9d4a-b136556f7052" 
containerName="main" probeResult="failure" output="Get \"https://10.133.0.47:8000/health\": dial tcp 10.133.0.47:8000: connect: connection refused" Apr 17 16:45:42.306873 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:42.306832 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7466444c7d-kt6p6"] Apr 17 16:45:42.307334 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:42.307177 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7466444c7d-kt6p6" podUID="8a46c102-80ee-41cf-8a52-8cb2a81ff5fc" containerName="main" containerID="cri-o://5118d01a61f817f79445ed4fb9b566153d8f1a9776ac0fb0719404b808ae6c43" gracePeriod=30 Apr 17 16:45:42.569296 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:42.569240 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7466444c7d-kt6p6" Apr 17 16:45:42.715849 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:42.715816 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8a46c102-80ee-41cf-8a52-8cb2a81ff5fc-tls-certs\") pod \"8a46c102-80ee-41cf-8a52-8cb2a81ff5fc\" (UID: \"8a46c102-80ee-41cf-8a52-8cb2a81ff5fc\") " Apr 17 16:45:42.716044 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:42.715869 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8a46c102-80ee-41cf-8a52-8cb2a81ff5fc-home\") pod \"8a46c102-80ee-41cf-8a52-8cb2a81ff5fc\" (UID: \"8a46c102-80ee-41cf-8a52-8cb2a81ff5fc\") " Apr 17 16:45:42.716044 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:42.715903 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8a46c102-80ee-41cf-8a52-8cb2a81ff5fc-model-cache\") pod 
\"8a46c102-80ee-41cf-8a52-8cb2a81ff5fc\" (UID: \"8a46c102-80ee-41cf-8a52-8cb2a81ff5fc\") " Apr 17 16:45:42.716044 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:42.715952 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8a46c102-80ee-41cf-8a52-8cb2a81ff5fc-dshm\") pod \"8a46c102-80ee-41cf-8a52-8cb2a81ff5fc\" (UID: \"8a46c102-80ee-41cf-8a52-8cb2a81ff5fc\") " Apr 17 16:45:42.716044 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:42.715979 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5lv74\" (UniqueName: \"kubernetes.io/projected/8a46c102-80ee-41cf-8a52-8cb2a81ff5fc-kube-api-access-5lv74\") pod \"8a46c102-80ee-41cf-8a52-8cb2a81ff5fc\" (UID: \"8a46c102-80ee-41cf-8a52-8cb2a81ff5fc\") " Apr 17 16:45:42.716044 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:42.715999 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8a46c102-80ee-41cf-8a52-8cb2a81ff5fc-kserve-provision-location\") pod \"8a46c102-80ee-41cf-8a52-8cb2a81ff5fc\" (UID: \"8a46c102-80ee-41cf-8a52-8cb2a81ff5fc\") " Apr 17 16:45:42.716337 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:42.716200 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a46c102-80ee-41cf-8a52-8cb2a81ff5fc-model-cache" (OuterVolumeSpecName: "model-cache") pod "8a46c102-80ee-41cf-8a52-8cb2a81ff5fc" (UID: "8a46c102-80ee-41cf-8a52-8cb2a81ff5fc"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:45:42.716337 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:42.716200 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a46c102-80ee-41cf-8a52-8cb2a81ff5fc-home" (OuterVolumeSpecName: "home") pod "8a46c102-80ee-41cf-8a52-8cb2a81ff5fc" (UID: "8a46c102-80ee-41cf-8a52-8cb2a81ff5fc"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:45:42.718168 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:42.718136 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a46c102-80ee-41cf-8a52-8cb2a81ff5fc-kube-api-access-5lv74" (OuterVolumeSpecName: "kube-api-access-5lv74") pod "8a46c102-80ee-41cf-8a52-8cb2a81ff5fc" (UID: "8a46c102-80ee-41cf-8a52-8cb2a81ff5fc"). InnerVolumeSpecName "kube-api-access-5lv74". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:45:42.718290 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:42.718223 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a46c102-80ee-41cf-8a52-8cb2a81ff5fc-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "8a46c102-80ee-41cf-8a52-8cb2a81ff5fc" (UID: "8a46c102-80ee-41cf-8a52-8cb2a81ff5fc"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:45:42.718520 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:42.718487 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a46c102-80ee-41cf-8a52-8cb2a81ff5fc-dshm" (OuterVolumeSpecName: "dshm") pod "8a46c102-80ee-41cf-8a52-8cb2a81ff5fc" (UID: "8a46c102-80ee-41cf-8a52-8cb2a81ff5fc"). InnerVolumeSpecName "dshm". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:45:42.770570 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:42.770534 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a46c102-80ee-41cf-8a52-8cb2a81ff5fc-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "8a46c102-80ee-41cf-8a52-8cb2a81ff5fc" (UID: "8a46c102-80ee-41cf-8a52-8cb2a81ff5fc"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:45:42.817358 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:42.817324 2572 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8a46c102-80ee-41cf-8a52-8cb2a81ff5fc-home\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\"" Apr 17 16:45:42.817503 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:42.817362 2572 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8a46c102-80ee-41cf-8a52-8cb2a81ff5fc-model-cache\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\"" Apr 17 16:45:42.817503 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:42.817380 2572 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8a46c102-80ee-41cf-8a52-8cb2a81ff5fc-dshm\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\"" Apr 17 16:45:42.817503 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:42.817395 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5lv74\" (UniqueName: \"kubernetes.io/projected/8a46c102-80ee-41cf-8a52-8cb2a81ff5fc-kube-api-access-5lv74\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\"" Apr 17 16:45:42.817503 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:42.817409 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/8a46c102-80ee-41cf-8a52-8cb2a81ff5fc-kserve-provision-location\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\"" Apr 17 16:45:42.817503 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:42.817425 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8a46c102-80ee-41cf-8a52-8cb2a81ff5fc-tls-certs\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\"" Apr 17 16:45:43.169400 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:43.169366 2572 generic.go:358] "Generic (PLEG): container finished" podID="8a46c102-80ee-41cf-8a52-8cb2a81ff5fc" containerID="5118d01a61f817f79445ed4fb9b566153d8f1a9776ac0fb0719404b808ae6c43" exitCode=0 Apr 17 16:45:43.169566 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:43.169443 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7466444c7d-kt6p6" Apr 17 16:45:43.169566 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:43.169454 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7466444c7d-kt6p6" event={"ID":"8a46c102-80ee-41cf-8a52-8cb2a81ff5fc","Type":"ContainerDied","Data":"5118d01a61f817f79445ed4fb9b566153d8f1a9776ac0fb0719404b808ae6c43"} Apr 17 16:45:43.169566 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:43.169494 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7466444c7d-kt6p6" event={"ID":"8a46c102-80ee-41cf-8a52-8cb2a81ff5fc","Type":"ContainerDied","Data":"3d5e2058d9f500423576e71a1aca3005074a3af6cab9bc9d72ddd3e3ceda45ba"} Apr 17 16:45:43.169566 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:43.169510 2572 scope.go:117] "RemoveContainer" containerID="5118d01a61f817f79445ed4fb9b566153d8f1a9776ac0fb0719404b808ae6c43" Apr 17 16:45:43.179230 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:43.179213 2572 scope.go:117] 
"RemoveContainer" containerID="493ed7d84909d46bd88424828f2f842bb82573107c6f9a3b66f15992b97967d0" Apr 17 16:45:43.191285 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:43.191258 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7466444c7d-kt6p6"] Apr 17 16:45:43.193527 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:43.193508 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7466444c7d-kt6p6"] Apr 17 16:45:43.240003 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:43.239973 2572 scope.go:117] "RemoveContainer" containerID="5118d01a61f817f79445ed4fb9b566153d8f1a9776ac0fb0719404b808ae6c43" Apr 17 16:45:43.240318 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:45:43.240296 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5118d01a61f817f79445ed4fb9b566153d8f1a9776ac0fb0719404b808ae6c43\": container with ID starting with 5118d01a61f817f79445ed4fb9b566153d8f1a9776ac0fb0719404b808ae6c43 not found: ID does not exist" containerID="5118d01a61f817f79445ed4fb9b566153d8f1a9776ac0fb0719404b808ae6c43" Apr 17 16:45:43.240418 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:43.240329 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5118d01a61f817f79445ed4fb9b566153d8f1a9776ac0fb0719404b808ae6c43"} err="failed to get container status \"5118d01a61f817f79445ed4fb9b566153d8f1a9776ac0fb0719404b808ae6c43\": rpc error: code = NotFound desc = could not find container \"5118d01a61f817f79445ed4fb9b566153d8f1a9776ac0fb0719404b808ae6c43\": container with ID starting with 5118d01a61f817f79445ed4fb9b566153d8f1a9776ac0fb0719404b808ae6c43 not found: ID does not exist" Apr 17 16:45:43.240418 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:43.240353 2572 scope.go:117] "RemoveContainer" 
containerID="493ed7d84909d46bd88424828f2f842bb82573107c6f9a3b66f15992b97967d0" Apr 17 16:45:43.240661 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:45:43.240643 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"493ed7d84909d46bd88424828f2f842bb82573107c6f9a3b66f15992b97967d0\": container with ID starting with 493ed7d84909d46bd88424828f2f842bb82573107c6f9a3b66f15992b97967d0 not found: ID does not exist" containerID="493ed7d84909d46bd88424828f2f842bb82573107c6f9a3b66f15992b97967d0" Apr 17 16:45:43.240749 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:43.240667 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"493ed7d84909d46bd88424828f2f842bb82573107c6f9a3b66f15992b97967d0"} err="failed to get container status \"493ed7d84909d46bd88424828f2f842bb82573107c6f9a3b66f15992b97967d0\": rpc error: code = NotFound desc = could not find container \"493ed7d84909d46bd88424828f2f842bb82573107c6f9a3b66f15992b97967d0\": container with ID starting with 493ed7d84909d46bd88424828f2f842bb82573107c6f9a3b66f15992b97967d0 not found: ID does not exist" Apr 17 16:45:43.973988 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:43.973958 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a46c102-80ee-41cf-8a52-8cb2a81ff5fc" path="/var/lib/kubelet/pods/8a46c102-80ee-41cf-8a52-8cb2a81ff5fc/volumes" Apr 17 16:45:48.023593 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:48.023524 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c67d7cdfft55zj" podUID="d6b218a1-63ce-4c38-9d4a-b136556f7052" containerName="main" probeResult="failure" output="Get \"https://10.133.0.47:8000/health\": dial tcp 10.133.0.47:8000: connect: connection refused" Apr 17 16:45:58.023550 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:45:58.023504 2572 prober.go:120] "Probe failed" 
probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c67d7cdfft55zj" podUID="d6b218a1-63ce-4c38-9d4a-b136556f7052" containerName="main" probeResult="failure" output="Get \"https://10.133.0.47:8000/health\": dial tcp 10.133.0.47:8000: connect: connection refused" Apr 17 16:46:00.954951 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:46:00.954909 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-5455dfc465-gttnd"] Apr 17 16:46:00.955466 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:46:00.955446 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8a46c102-80ee-41cf-8a52-8cb2a81ff5fc" containerName="storage-initializer" Apr 17 16:46:00.955515 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:46:00.955471 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a46c102-80ee-41cf-8a52-8cb2a81ff5fc" containerName="storage-initializer" Apr 17 16:46:00.955515 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:46:00.955484 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8a46c102-80ee-41cf-8a52-8cb2a81ff5fc" containerName="main" Apr 17 16:46:00.955515 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:46:00.955493 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a46c102-80ee-41cf-8a52-8cb2a81ff5fc" containerName="main" Apr 17 16:46:00.955615 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:46:00.955603 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="8a46c102-80ee-41cf-8a52-8cb2a81ff5fc" containerName="main" Apr 17 16:46:01.384005 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:46:01.383966 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-5455dfc465-gttnd"] Apr 17 16:46:01.384203 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:46:01.384116 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5455dfc465-gttnd" Apr 17 16:46:01.386494 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:46:01.386469 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-kserve-self-signed-certs\"" Apr 17 16:46:01.480089 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:46:01.480055 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-px7d4\" (UniqueName: \"kubernetes.io/projected/40524dd1-3f23-4605-a734-add4726ddd49-kube-api-access-px7d4\") pod \"stop-feature-test-kserve-5455dfc465-gttnd\" (UID: \"40524dd1-3f23-4605-a734-add4726ddd49\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5455dfc465-gttnd" Apr 17 16:46:01.480225 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:46:01.480151 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/40524dd1-3f23-4605-a734-add4726ddd49-home\") pod \"stop-feature-test-kserve-5455dfc465-gttnd\" (UID: \"40524dd1-3f23-4605-a734-add4726ddd49\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5455dfc465-gttnd" Apr 17 16:46:01.480225 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:46:01.480177 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/40524dd1-3f23-4605-a734-add4726ddd49-model-cache\") pod \"stop-feature-test-kserve-5455dfc465-gttnd\" (UID: \"40524dd1-3f23-4605-a734-add4726ddd49\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5455dfc465-gttnd" Apr 17 16:46:01.480374 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:46:01.480270 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/40524dd1-3f23-4605-a734-add4726ddd49-dshm\") pod 
\"stop-feature-test-kserve-5455dfc465-gttnd\" (UID: \"40524dd1-3f23-4605-a734-add4726ddd49\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5455dfc465-gttnd" Apr 17 16:46:01.480374 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:46:01.480317 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/40524dd1-3f23-4605-a734-add4726ddd49-kserve-provision-location\") pod \"stop-feature-test-kserve-5455dfc465-gttnd\" (UID: \"40524dd1-3f23-4605-a734-add4726ddd49\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5455dfc465-gttnd" Apr 17 16:46:01.480516 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:46:01.480492 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/40524dd1-3f23-4605-a734-add4726ddd49-tls-certs\") pod \"stop-feature-test-kserve-5455dfc465-gttnd\" (UID: \"40524dd1-3f23-4605-a734-add4726ddd49\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5455dfc465-gttnd" Apr 17 16:46:01.581523 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:46:01.581482 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/40524dd1-3f23-4605-a734-add4726ddd49-home\") pod \"stop-feature-test-kserve-5455dfc465-gttnd\" (UID: \"40524dd1-3f23-4605-a734-add4726ddd49\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5455dfc465-gttnd" Apr 17 16:46:01.581523 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:46:01.581535 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/40524dd1-3f23-4605-a734-add4726ddd49-model-cache\") pod \"stop-feature-test-kserve-5455dfc465-gttnd\" (UID: \"40524dd1-3f23-4605-a734-add4726ddd49\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5455dfc465-gttnd" Apr 17 16:46:01.581815 ip-10-0-131-177 
kubenswrapper[2572]: I0417 16:46:01.581600 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/40524dd1-3f23-4605-a734-add4726ddd49-dshm\") pod \"stop-feature-test-kserve-5455dfc465-gttnd\" (UID: \"40524dd1-3f23-4605-a734-add4726ddd49\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5455dfc465-gttnd" Apr 17 16:46:01.581815 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:46:01.581645 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/40524dd1-3f23-4605-a734-add4726ddd49-kserve-provision-location\") pod \"stop-feature-test-kserve-5455dfc465-gttnd\" (UID: \"40524dd1-3f23-4605-a734-add4726ddd49\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5455dfc465-gttnd" Apr 17 16:46:01.581815 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:46:01.581692 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/40524dd1-3f23-4605-a734-add4726ddd49-tls-certs\") pod \"stop-feature-test-kserve-5455dfc465-gttnd\" (UID: \"40524dd1-3f23-4605-a734-add4726ddd49\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5455dfc465-gttnd" Apr 17 16:46:01.581815 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:46:01.581808 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-px7d4\" (UniqueName: \"kubernetes.io/projected/40524dd1-3f23-4605-a734-add4726ddd49-kube-api-access-px7d4\") pod \"stop-feature-test-kserve-5455dfc465-gttnd\" (UID: \"40524dd1-3f23-4605-a734-add4726ddd49\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5455dfc465-gttnd" Apr 17 16:46:01.582032 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:46:01.581928 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/40524dd1-3f23-4605-a734-add4726ddd49-home\") pod 
\"stop-feature-test-kserve-5455dfc465-gttnd\" (UID: \"40524dd1-3f23-4605-a734-add4726ddd49\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5455dfc465-gttnd" Apr 17 16:46:01.582032 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:46:01.581974 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/40524dd1-3f23-4605-a734-add4726ddd49-model-cache\") pod \"stop-feature-test-kserve-5455dfc465-gttnd\" (UID: \"40524dd1-3f23-4605-a734-add4726ddd49\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5455dfc465-gttnd" Apr 17 16:46:01.582032 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:46:01.582013 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/40524dd1-3f23-4605-a734-add4726ddd49-kserve-provision-location\") pod \"stop-feature-test-kserve-5455dfc465-gttnd\" (UID: \"40524dd1-3f23-4605-a734-add4726ddd49\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5455dfc465-gttnd" Apr 17 16:46:01.584056 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:46:01.584031 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/40524dd1-3f23-4605-a734-add4726ddd49-dshm\") pod \"stop-feature-test-kserve-5455dfc465-gttnd\" (UID: \"40524dd1-3f23-4605-a734-add4726ddd49\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5455dfc465-gttnd" Apr 17 16:46:01.584362 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:46:01.584344 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/40524dd1-3f23-4605-a734-add4726ddd49-tls-certs\") pod \"stop-feature-test-kserve-5455dfc465-gttnd\" (UID: \"40524dd1-3f23-4605-a734-add4726ddd49\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5455dfc465-gttnd" Apr 17 16:46:01.589409 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:46:01.589386 2572 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-px7d4\" (UniqueName: \"kubernetes.io/projected/40524dd1-3f23-4605-a734-add4726ddd49-kube-api-access-px7d4\") pod \"stop-feature-test-kserve-5455dfc465-gttnd\" (UID: \"40524dd1-3f23-4605-a734-add4726ddd49\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5455dfc465-gttnd" Apr 17 16:46:01.726833 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:46:01.726747 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5455dfc465-gttnd" Apr 17 16:46:01.857139 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:46:01.857108 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod40524dd1_3f23_4605_a734_add4726ddd49.slice/crio-c47616fb965db440f7109b484d877bc69d496a6319fd7d61f00019776d85f74b WatchSource:0}: Error finding container c47616fb965db440f7109b484d877bc69d496a6319fd7d61f00019776d85f74b: Status 404 returned error can't find the container with id c47616fb965db440f7109b484d877bc69d496a6319fd7d61f00019776d85f74b Apr 17 16:46:01.857259 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:46:01.857183 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-5455dfc465-gttnd"] Apr 17 16:46:02.249218 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:46:02.249126 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5455dfc465-gttnd" event={"ID":"40524dd1-3f23-4605-a734-add4726ddd49","Type":"ContainerStarted","Data":"12292db98fd44b9b8332ea536bc3016912202a0b59180ed0a30968d202099f23"} Apr 17 16:46:02.249218 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:46:02.249170 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5455dfc465-gttnd" 
event={"ID":"40524dd1-3f23-4605-a734-add4726ddd49","Type":"ContainerStarted","Data":"c47616fb965db440f7109b484d877bc69d496a6319fd7d61f00019776d85f74b"} Apr 17 16:46:07.272533 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:46:07.272499 2572 generic.go:358] "Generic (PLEG): container finished" podID="40524dd1-3f23-4605-a734-add4726ddd49" containerID="12292db98fd44b9b8332ea536bc3016912202a0b59180ed0a30968d202099f23" exitCode=0 Apr 17 16:46:07.273059 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:46:07.272576 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5455dfc465-gttnd" event={"ID":"40524dd1-3f23-4605-a734-add4726ddd49","Type":"ContainerDied","Data":"12292db98fd44b9b8332ea536bc3016912202a0b59180ed0a30968d202099f23"} Apr 17 16:46:08.023621 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:46:08.023578 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c67d7cdfft55zj" podUID="d6b218a1-63ce-4c38-9d4a-b136556f7052" containerName="main" probeResult="failure" output="Get \"https://10.133.0.47:8000/health\": dial tcp 10.133.0.47:8000: connect: connection refused" Apr 17 16:46:08.279115 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:46:08.279028 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5455dfc465-gttnd" event={"ID":"40524dd1-3f23-4605-a734-add4726ddd49","Type":"ContainerStarted","Data":"756e68bc56e8459d5d19871b588b0e7826180671f66c032ec0abd2351e77f9b7"} Apr 17 16:46:08.307655 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:46:08.307596 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5455dfc465-gttnd" podStartSLOduration=8.307576916 podStartE2EDuration="8.307576916s" podCreationTimestamp="2026-04-17 16:46:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2026-04-17 16:46:08.305310103 +0000 UTC m=+890.892899074" watchObservedRunningTime="2026-04-17 16:46:08.307576916 +0000 UTC m=+890.895165887" Apr 17 16:46:11.726985 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:46:11.726929 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5455dfc465-gttnd" Apr 17 16:46:11.726985 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:46:11.726984 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5455dfc465-gttnd" Apr 17 16:46:11.728451 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:46:11.728420 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5455dfc465-gttnd" podUID="40524dd1-3f23-4605-a734-add4726ddd49" containerName="main" probeResult="failure" output="Get \"https://10.133.0.49:8000/health\": dial tcp 10.133.0.49:8000: connect: connection refused" Apr 17 16:46:17.904691 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:46:17.904658 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-qghlf_4135c5b6-7f8a-4eaf-b551-405c8ab00981/console-operator/1.log" Apr 17 16:46:17.908015 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:46:17.907990 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-qghlf_4135c5b6-7f8a-4eaf-b551-405c8ab00981/console-operator/1.log" Apr 17 16:46:17.910554 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:46:17.910532 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vj4bz_f0075f81-88ff-4518-bd4e-bc50656a593b/ovn-acl-logging/0.log" Apr 17 16:46:17.913804 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:46:17.913788 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vj4bz_f0075f81-88ff-4518-bd4e-bc50656a593b/ovn-acl-logging/0.log" Apr 17 16:46:18.023828 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:46:18.023779 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c67d7cdfft55zj" podUID="d6b218a1-63ce-4c38-9d4a-b136556f7052" containerName="main" probeResult="failure" output="Get \"https://10.133.0.47:8000/health\": dial tcp 10.133.0.47:8000: connect: connection refused" Apr 17 16:46:21.728047 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:46:21.727998 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5455dfc465-gttnd" podUID="40524dd1-3f23-4605-a734-add4726ddd49" containerName="main" probeResult="failure" output="Get \"https://10.133.0.49:8000/health\": dial tcp 10.133.0.49:8000: connect: connection refused" Apr 17 16:46:28.023574 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:46:28.023519 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c67d7cdfft55zj" podUID="d6b218a1-63ce-4c38-9d4a-b136556f7052" containerName="main" probeResult="failure" output="Get \"https://10.133.0.47:8000/health\": dial tcp 10.133.0.47:8000: connect: connection refused" Apr 17 16:46:31.727868 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:46:31.727817 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5455dfc465-gttnd" podUID="40524dd1-3f23-4605-a734-add4726ddd49" containerName="main" probeResult="failure" output="Get \"https://10.133.0.49:8000/health\": dial tcp 10.133.0.49:8000: connect: connection refused" Apr 17 16:46:38.024071 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:46:38.024022 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c67d7cdfft55zj" 
podUID="d6b218a1-63ce-4c38-9d4a-b136556f7052" containerName="main" probeResult="failure" output="Get \"https://10.133.0.47:8000/health\": dial tcp 10.133.0.47:8000: connect: connection refused" Apr 17 16:46:41.728159 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:46:41.728115 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5455dfc465-gttnd" podUID="40524dd1-3f23-4605-a734-add4726ddd49" containerName="main" probeResult="failure" output="Get \"https://10.133.0.49:8000/health\": dial tcp 10.133.0.49:8000: connect: connection refused" Apr 17 16:46:48.024133 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:46:48.024083 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c67d7cdfft55zj" podUID="d6b218a1-63ce-4c38-9d4a-b136556f7052" containerName="main" probeResult="failure" output="Get \"https://10.133.0.47:8000/health\": dial tcp 10.133.0.47:8000: connect: connection refused" Apr 17 16:46:51.727264 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:46:51.727212 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5455dfc465-gttnd" podUID="40524dd1-3f23-4605-a734-add4726ddd49" containerName="main" probeResult="failure" output="Get \"https://10.133.0.49:8000/health\": dial tcp 10.133.0.49:8000: connect: connection refused" Apr 17 16:46:58.023359 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:46:58.023317 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c67d7cdfft55zj" podUID="d6b218a1-63ce-4c38-9d4a-b136556f7052" containerName="main" probeResult="failure" output="Get \"https://10.133.0.47:8000/health\": dial tcp 10.133.0.47:8000: connect: connection refused" Apr 17 16:47:01.727773 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:47:01.727701 2572 prober.go:120] "Probe failed" probeType="Startup" 
pod="kserve-ci-e2e-test/stop-feature-test-kserve-5455dfc465-gttnd" podUID="40524dd1-3f23-4605-a734-add4726ddd49" containerName="main" probeResult="failure" output="Get \"https://10.133.0.49:8000/health\": dial tcp 10.133.0.49:8000: connect: connection refused" Apr 17 16:47:08.023557 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:47:08.023506 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c67d7cdfft55zj" podUID="d6b218a1-63ce-4c38-9d4a-b136556f7052" containerName="main" probeResult="failure" output="Get \"https://10.133.0.47:8000/health\": dial tcp 10.133.0.47:8000: connect: connection refused" Apr 17 16:47:11.727848 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:47:11.727792 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5455dfc465-gttnd" podUID="40524dd1-3f23-4605-a734-add4726ddd49" containerName="main" probeResult="failure" output="Get \"https://10.133.0.49:8000/health\": dial tcp 10.133.0.49:8000: connect: connection refused" Apr 17 16:47:18.033564 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:47:18.033529 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c67d7cdfft55zj" Apr 17 16:47:18.042373 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:47:18.042346 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c67d7cdfft55zj" Apr 17 16:47:21.727926 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:47:21.727875 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5455dfc465-gttnd" podUID="40524dd1-3f23-4605-a734-add4726ddd49" containerName="main" probeResult="failure" output="Get \"https://10.133.0.49:8000/health\": dial tcp 10.133.0.49:8000: connect: connection refused" Apr 17 16:47:26.965362 
ip-10-0-131-177 kubenswrapper[2572]: I0417 16:47:26.965326 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c67d7cdfft55zj"] Apr 17 16:47:26.965886 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:47:26.965589 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c67d7cdfft55zj" podUID="d6b218a1-63ce-4c38-9d4a-b136556f7052" containerName="main" containerID="cri-o://cd753c0df81593b022fc1dc0a7c175ddb5ec5024cf0239b551b90a8501bc4586" gracePeriod=30 Apr 17 16:47:31.727254 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:47:31.727207 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5455dfc465-gttnd" podUID="40524dd1-3f23-4605-a734-add4726ddd49" containerName="main" probeResult="failure" output="Get \"https://10.133.0.49:8000/health\": dial tcp 10.133.0.49:8000: connect: connection refused" Apr 17 16:47:41.727639 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:47:41.727537 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5455dfc465-gttnd" podUID="40524dd1-3f23-4605-a734-add4726ddd49" containerName="main" probeResult="failure" output="Get \"https://10.133.0.49:8000/health\": dial tcp 10.133.0.49:8000: connect: connection refused" Apr 17 16:47:51.737736 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:47:51.737683 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5455dfc465-gttnd" Apr 17 16:47:51.745413 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:47:51.745390 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5455dfc465-gttnd" Apr 17 16:47:52.749559 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:47:52.749515 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/stop-feature-test-kserve-5455dfc465-gttnd"] Apr 17 16:47:52.822006 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:47:52.821970 2572 secret.go:189] Couldn't get secret kserve-ci-e2e-test/stop-feature-test-kserve-self-signed-certs: secret "stop-feature-test-kserve-self-signed-certs" not found Apr 17 16:47:52.822192 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:47:52.822037 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/40524dd1-3f23-4605-a734-add4726ddd49-tls-certs podName:40524dd1-3f23-4605-a734-add4726ddd49 nodeName:}" failed. No retries permitted until 2026-04-17 16:47:53.322020435 +0000 UTC m=+995.909609383 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/40524dd1-3f23-4605-a734-add4726ddd49-tls-certs") pod "stop-feature-test-kserve-5455dfc465-gttnd" (UID: "40524dd1-3f23-4605-a734-add4726ddd49") : secret "stop-feature-test-kserve-self-signed-certs" not found Apr 17 16:47:53.326041 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:47:53.326003 2572 secret.go:189] Couldn't get secret kserve-ci-e2e-test/stop-feature-test-kserve-self-signed-certs: secret "stop-feature-test-kserve-self-signed-certs" not found Apr 17 16:47:53.326240 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:47:53.326088 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/40524dd1-3f23-4605-a734-add4726ddd49-tls-certs podName:40524dd1-3f23-4605-a734-add4726ddd49 nodeName:}" failed. No retries permitted until 2026-04-17 16:47:54.326068225 +0000 UTC m=+996.913657173 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/40524dd1-3f23-4605-a734-add4726ddd49-tls-certs") pod "stop-feature-test-kserve-5455dfc465-gttnd" (UID: "40524dd1-3f23-4605-a734-add4726ddd49") : secret "stop-feature-test-kserve-self-signed-certs" not found Apr 17 16:47:53.722226 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:47:53.722158 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5455dfc465-gttnd" podUID="40524dd1-3f23-4605-a734-add4726ddd49" containerName="main" containerID="cri-o://756e68bc56e8459d5d19871b588b0e7826180671f66c032ec0abd2351e77f9b7" gracePeriod=30 Apr 17 16:47:54.334537 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:47:54.334502 2572 secret.go:189] Couldn't get secret kserve-ci-e2e-test/stop-feature-test-kserve-self-signed-certs: secret "stop-feature-test-kserve-self-signed-certs" not found Apr 17 16:47:54.334937 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:47:54.334581 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/40524dd1-3f23-4605-a734-add4726ddd49-tls-certs podName:40524dd1-3f23-4605-a734-add4726ddd49 nodeName:}" failed. No retries permitted until 2026-04-17 16:47:56.334566255 +0000 UTC m=+998.922155213 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/40524dd1-3f23-4605-a734-add4726ddd49-tls-certs") pod "stop-feature-test-kserve-5455dfc465-gttnd" (UID: "40524dd1-3f23-4605-a734-add4726ddd49") : secret "stop-feature-test-kserve-self-signed-certs" not found Apr 17 16:47:56.354901 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:47:56.354869 2572 secret.go:189] Couldn't get secret kserve-ci-e2e-test/stop-feature-test-kserve-self-signed-certs: secret "stop-feature-test-kserve-self-signed-certs" not found Apr 17 16:47:56.355268 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:47:56.354940 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/40524dd1-3f23-4605-a734-add4726ddd49-tls-certs podName:40524dd1-3f23-4605-a734-add4726ddd49 nodeName:}" failed. No retries permitted until 2026-04-17 16:48:00.354927349 +0000 UTC m=+1002.942516297 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/40524dd1-3f23-4605-a734-add4726ddd49-tls-certs") pod "stop-feature-test-kserve-5455dfc465-gttnd" (UID: "40524dd1-3f23-4605-a734-add4726ddd49") : secret "stop-feature-test-kserve-self-signed-certs" not found Apr 17 16:47:57.212257 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:47:57.212231 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c67d7cdfft55zj_d6b218a1-63ce-4c38-9d4a-b136556f7052/main/0.log" Apr 17 16:47:57.212646 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:47:57.212631 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c67d7cdfft55zj" Apr 17 16:47:57.362282 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:47:57.362250 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d6b218a1-63ce-4c38-9d4a-b136556f7052-kserve-provision-location\") pod \"d6b218a1-63ce-4c38-9d4a-b136556f7052\" (UID: \"d6b218a1-63ce-4c38-9d4a-b136556f7052\") " Apr 17 16:47:57.362707 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:47:57.362315 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nnpkj\" (UniqueName: \"kubernetes.io/projected/d6b218a1-63ce-4c38-9d4a-b136556f7052-kube-api-access-nnpkj\") pod \"d6b218a1-63ce-4c38-9d4a-b136556f7052\" (UID: \"d6b218a1-63ce-4c38-9d4a-b136556f7052\") " Apr 17 16:47:57.362707 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:47:57.362360 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d6b218a1-63ce-4c38-9d4a-b136556f7052-model-cache\") pod \"d6b218a1-63ce-4c38-9d4a-b136556f7052\" (UID: \"d6b218a1-63ce-4c38-9d4a-b136556f7052\") " Apr 17 16:47:57.362707 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:47:57.362384 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d6b218a1-63ce-4c38-9d4a-b136556f7052-dshm\") pod \"d6b218a1-63ce-4c38-9d4a-b136556f7052\" (UID: \"d6b218a1-63ce-4c38-9d4a-b136556f7052\") " Apr 17 16:47:57.362707 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:47:57.362412 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d6b218a1-63ce-4c38-9d4a-b136556f7052-home\") pod \"d6b218a1-63ce-4c38-9d4a-b136556f7052\" (UID: \"d6b218a1-63ce-4c38-9d4a-b136556f7052\") " Apr 17 16:47:57.362707 
ip-10-0-131-177 kubenswrapper[2572]: I0417 16:47:57.362445 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d6b218a1-63ce-4c38-9d4a-b136556f7052-tls-certs\") pod \"d6b218a1-63ce-4c38-9d4a-b136556f7052\" (UID: \"d6b218a1-63ce-4c38-9d4a-b136556f7052\") " Apr 17 16:47:57.362707 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:47:57.362658 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6b218a1-63ce-4c38-9d4a-b136556f7052-model-cache" (OuterVolumeSpecName: "model-cache") pod "d6b218a1-63ce-4c38-9d4a-b136556f7052" (UID: "d6b218a1-63ce-4c38-9d4a-b136556f7052"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:47:57.363051 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:47:57.362789 2572 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d6b218a1-63ce-4c38-9d4a-b136556f7052-model-cache\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\"" Apr 17 16:47:57.363051 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:47:57.362870 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6b218a1-63ce-4c38-9d4a-b136556f7052-home" (OuterVolumeSpecName: "home") pod "d6b218a1-63ce-4c38-9d4a-b136556f7052" (UID: "d6b218a1-63ce-4c38-9d4a-b136556f7052"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:47:57.364830 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:47:57.364792 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6b218a1-63ce-4c38-9d4a-b136556f7052-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "d6b218a1-63ce-4c38-9d4a-b136556f7052" (UID: "d6b218a1-63ce-4c38-9d4a-b136556f7052"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:47:57.364953 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:47:57.364879 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6b218a1-63ce-4c38-9d4a-b136556f7052-dshm" (OuterVolumeSpecName: "dshm") pod "d6b218a1-63ce-4c38-9d4a-b136556f7052" (UID: "d6b218a1-63ce-4c38-9d4a-b136556f7052"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:47:57.364953 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:47:57.364904 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6b218a1-63ce-4c38-9d4a-b136556f7052-kube-api-access-nnpkj" (OuterVolumeSpecName: "kube-api-access-nnpkj") pod "d6b218a1-63ce-4c38-9d4a-b136556f7052" (UID: "d6b218a1-63ce-4c38-9d4a-b136556f7052"). InnerVolumeSpecName "kube-api-access-nnpkj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:47:57.414093 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:47:57.414060 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6b218a1-63ce-4c38-9d4a-b136556f7052-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "d6b218a1-63ce-4c38-9d4a-b136556f7052" (UID: "d6b218a1-63ce-4c38-9d4a-b136556f7052"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:47:57.463342 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:47:57.463316 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nnpkj\" (UniqueName: \"kubernetes.io/projected/d6b218a1-63ce-4c38-9d4a-b136556f7052-kube-api-access-nnpkj\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\"" Apr 17 16:47:57.463342 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:47:57.463338 2572 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d6b218a1-63ce-4c38-9d4a-b136556f7052-dshm\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\"" Apr 17 16:47:57.463476 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:47:57.463347 2572 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d6b218a1-63ce-4c38-9d4a-b136556f7052-home\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\"" Apr 17 16:47:57.463476 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:47:57.463355 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d6b218a1-63ce-4c38-9d4a-b136556f7052-tls-certs\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\"" Apr 17 16:47:57.463476 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:47:57.463364 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d6b218a1-63ce-4c38-9d4a-b136556f7052-kserve-provision-location\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\"" Apr 17 16:47:57.738746 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:47:57.738645 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c67d7cdfft55zj_d6b218a1-63ce-4c38-9d4a-b136556f7052/main/0.log" Apr 17 16:47:57.739040 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:47:57.739018 2572 generic.go:358] "Generic (PLEG): 
container finished" podID="d6b218a1-63ce-4c38-9d4a-b136556f7052" containerID="cd753c0df81593b022fc1dc0a7c175ddb5ec5024cf0239b551b90a8501bc4586" exitCode=137 Apr 17 16:47:57.739103 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:47:57.739083 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c67d7cdfft55zj" event={"ID":"d6b218a1-63ce-4c38-9d4a-b136556f7052","Type":"ContainerDied","Data":"cd753c0df81593b022fc1dc0a7c175ddb5ec5024cf0239b551b90a8501bc4586"} Apr 17 16:47:57.739147 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:47:57.739122 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c67d7cdfft55zj" event={"ID":"d6b218a1-63ce-4c38-9d4a-b136556f7052","Type":"ContainerDied","Data":"3f20520f2af1daf573aaebb84d943292b1e30164f6317a10c0d156107c72afd5"} Apr 17 16:47:57.739147 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:47:57.739141 2572 scope.go:117] "RemoveContainer" containerID="cd753c0df81593b022fc1dc0a7c175ddb5ec5024cf0239b551b90a8501bc4586" Apr 17 16:47:57.739213 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:47:57.739092 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c67d7cdfft55zj" Apr 17 16:47:57.757842 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:47:57.757801 2572 scope.go:117] "RemoveContainer" containerID="e350a50d9ffe1b9f41b3e443d5fbe06418680a45fe7b9709b43cc48cae84177b" Apr 17 16:47:57.761600 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:47:57.761576 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c67d7cdfft55zj"] Apr 17 16:47:57.766435 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:47:57.766410 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7c67d7cdfft55zj"] Apr 17 16:47:57.768502 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:47:57.768469 2572 scope.go:117] "RemoveContainer" containerID="cd753c0df81593b022fc1dc0a7c175ddb5ec5024cf0239b551b90a8501bc4586" Apr 17 16:47:57.768803 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:47:57.768784 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd753c0df81593b022fc1dc0a7c175ddb5ec5024cf0239b551b90a8501bc4586\": container with ID starting with cd753c0df81593b022fc1dc0a7c175ddb5ec5024cf0239b551b90a8501bc4586 not found: ID does not exist" containerID="cd753c0df81593b022fc1dc0a7c175ddb5ec5024cf0239b551b90a8501bc4586" Apr 17 16:47:57.768864 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:47:57.768812 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd753c0df81593b022fc1dc0a7c175ddb5ec5024cf0239b551b90a8501bc4586"} err="failed to get container status \"cd753c0df81593b022fc1dc0a7c175ddb5ec5024cf0239b551b90a8501bc4586\": rpc error: code = NotFound desc = could not find container \"cd753c0df81593b022fc1dc0a7c175ddb5ec5024cf0239b551b90a8501bc4586\": container with ID starting with 
cd753c0df81593b022fc1dc0a7c175ddb5ec5024cf0239b551b90a8501bc4586 not found: ID does not exist" Apr 17 16:47:57.768864 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:47:57.768831 2572 scope.go:117] "RemoveContainer" containerID="e350a50d9ffe1b9f41b3e443d5fbe06418680a45fe7b9709b43cc48cae84177b" Apr 17 16:47:57.769101 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:47:57.769087 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e350a50d9ffe1b9f41b3e443d5fbe06418680a45fe7b9709b43cc48cae84177b\": container with ID starting with e350a50d9ffe1b9f41b3e443d5fbe06418680a45fe7b9709b43cc48cae84177b not found: ID does not exist" containerID="e350a50d9ffe1b9f41b3e443d5fbe06418680a45fe7b9709b43cc48cae84177b" Apr 17 16:47:57.769148 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:47:57.769105 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e350a50d9ffe1b9f41b3e443d5fbe06418680a45fe7b9709b43cc48cae84177b"} err="failed to get container status \"e350a50d9ffe1b9f41b3e443d5fbe06418680a45fe7b9709b43cc48cae84177b\": rpc error: code = NotFound desc = could not find container \"e350a50d9ffe1b9f41b3e443d5fbe06418680a45fe7b9709b43cc48cae84177b\": container with ID starting with e350a50d9ffe1b9f41b3e443d5fbe06418680a45fe7b9709b43cc48cae84177b not found: ID does not exist" Apr 17 16:47:57.973948 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:47:57.973917 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6b218a1-63ce-4c38-9d4a-b136556f7052" path="/var/lib/kubelet/pods/d6b218a1-63ce-4c38-9d4a-b136556f7052/volumes" Apr 17 16:48:00.392084 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:48:00.392039 2572 secret.go:189] Couldn't get secret kserve-ci-e2e-test/stop-feature-test-kserve-self-signed-certs: secret "stop-feature-test-kserve-self-signed-certs" not found Apr 17 16:48:00.392485 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:48:00.392120 2572 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/40524dd1-3f23-4605-a734-add4726ddd49-tls-certs podName:40524dd1-3f23-4605-a734-add4726ddd49 nodeName:}" failed. No retries permitted until 2026-04-17 16:48:08.392104847 +0000 UTC m=+1010.979693795 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/40524dd1-3f23-4605-a734-add4726ddd49-tls-certs") pod "stop-feature-test-kserve-5455dfc465-gttnd" (UID: "40524dd1-3f23-4605-a734-add4726ddd49") : secret "stop-feature-test-kserve-self-signed-certs" not found Apr 17 16:48:08.458550 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:48:08.458512 2572 secret.go:189] Couldn't get secret kserve-ci-e2e-test/stop-feature-test-kserve-self-signed-certs: secret "stop-feature-test-kserve-self-signed-certs" not found Apr 17 16:48:08.458955 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:48:08.458588 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/40524dd1-3f23-4605-a734-add4726ddd49-tls-certs podName:40524dd1-3f23-4605-a734-add4726ddd49 nodeName:}" failed. No retries permitted until 2026-04-17 16:48:24.458572275 +0000 UTC m=+1027.046161224 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/40524dd1-3f23-4605-a734-add4726ddd49-tls-certs") pod "stop-feature-test-kserve-5455dfc465-gttnd" (UID: "40524dd1-3f23-4605-a734-add4726ddd49") : secret "stop-feature-test-kserve-self-signed-certs" not found Apr 17 16:48:08.679666 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:48:08.679626 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-5455dfc465-2dpvf"] Apr 17 16:48:08.680060 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:48:08.680042 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d6b218a1-63ce-4c38-9d4a-b136556f7052" containerName="main" Apr 17 16:48:08.680166 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:48:08.680064 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6b218a1-63ce-4c38-9d4a-b136556f7052" containerName="main" Apr 17 16:48:08.680166 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:48:08.680076 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d6b218a1-63ce-4c38-9d4a-b136556f7052" containerName="storage-initializer" Apr 17 16:48:08.680166 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:48:08.680082 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6b218a1-63ce-4c38-9d4a-b136556f7052" containerName="storage-initializer" Apr 17 16:48:08.680166 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:48:08.680154 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="d6b218a1-63ce-4c38-9d4a-b136556f7052" containerName="main" Apr 17 16:48:08.685599 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:48:08.685582 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5455dfc465-2dpvf" Apr 17 16:48:08.696537 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:48:08.696513 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-5455dfc465-2dpvf"] Apr 17 16:48:08.760876 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:48:08.760804 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/76f951e6-6ffe-4e0f-b50c-a3c036bf3b3d-model-cache\") pod \"stop-feature-test-kserve-5455dfc465-2dpvf\" (UID: \"76f951e6-6ffe-4e0f-b50c-a3c036bf3b3d\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5455dfc465-2dpvf" Apr 17 16:48:08.760876 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:48:08.760835 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/76f951e6-6ffe-4e0f-b50c-a3c036bf3b3d-kserve-provision-location\") pod \"stop-feature-test-kserve-5455dfc465-2dpvf\" (UID: \"76f951e6-6ffe-4e0f-b50c-a3c036bf3b3d\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5455dfc465-2dpvf" Apr 17 16:48:08.761259 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:48:08.760949 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/76f951e6-6ffe-4e0f-b50c-a3c036bf3b3d-home\") pod \"stop-feature-test-kserve-5455dfc465-2dpvf\" (UID: \"76f951e6-6ffe-4e0f-b50c-a3c036bf3b3d\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5455dfc465-2dpvf" Apr 17 16:48:08.761259 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:48:08.760976 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/76f951e6-6ffe-4e0f-b50c-a3c036bf3b3d-tls-certs\") pod 
\"stop-feature-test-kserve-5455dfc465-2dpvf\" (UID: \"76f951e6-6ffe-4e0f-b50c-a3c036bf3b3d\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5455dfc465-2dpvf" Apr 17 16:48:08.761259 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:48:08.760992 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5xrs\" (UniqueName: \"kubernetes.io/projected/76f951e6-6ffe-4e0f-b50c-a3c036bf3b3d-kube-api-access-s5xrs\") pod \"stop-feature-test-kserve-5455dfc465-2dpvf\" (UID: \"76f951e6-6ffe-4e0f-b50c-a3c036bf3b3d\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5455dfc465-2dpvf" Apr 17 16:48:08.761259 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:48:08.761007 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/76f951e6-6ffe-4e0f-b50c-a3c036bf3b3d-dshm\") pod \"stop-feature-test-kserve-5455dfc465-2dpvf\" (UID: \"76f951e6-6ffe-4e0f-b50c-a3c036bf3b3d\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5455dfc465-2dpvf" Apr 17 16:48:08.862342 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:48:08.862309 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/76f951e6-6ffe-4e0f-b50c-a3c036bf3b3d-home\") pod \"stop-feature-test-kserve-5455dfc465-2dpvf\" (UID: \"76f951e6-6ffe-4e0f-b50c-a3c036bf3b3d\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5455dfc465-2dpvf" Apr 17 16:48:08.862342 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:48:08.862346 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/76f951e6-6ffe-4e0f-b50c-a3c036bf3b3d-tls-certs\") pod \"stop-feature-test-kserve-5455dfc465-2dpvf\" (UID: \"76f951e6-6ffe-4e0f-b50c-a3c036bf3b3d\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5455dfc465-2dpvf" Apr 17 16:48:08.862576 ip-10-0-131-177 kubenswrapper[2572]: I0417 
16:48:08.862363 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s5xrs\" (UniqueName: \"kubernetes.io/projected/76f951e6-6ffe-4e0f-b50c-a3c036bf3b3d-kube-api-access-s5xrs\") pod \"stop-feature-test-kserve-5455dfc465-2dpvf\" (UID: \"76f951e6-6ffe-4e0f-b50c-a3c036bf3b3d\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5455dfc465-2dpvf" Apr 17 16:48:08.862576 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:48:08.862380 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/76f951e6-6ffe-4e0f-b50c-a3c036bf3b3d-dshm\") pod \"stop-feature-test-kserve-5455dfc465-2dpvf\" (UID: \"76f951e6-6ffe-4e0f-b50c-a3c036bf3b3d\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5455dfc465-2dpvf" Apr 17 16:48:08.862576 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:48:08.862398 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/76f951e6-6ffe-4e0f-b50c-a3c036bf3b3d-model-cache\") pod \"stop-feature-test-kserve-5455dfc465-2dpvf\" (UID: \"76f951e6-6ffe-4e0f-b50c-a3c036bf3b3d\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5455dfc465-2dpvf" Apr 17 16:48:08.862576 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:48:08.862412 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/76f951e6-6ffe-4e0f-b50c-a3c036bf3b3d-kserve-provision-location\") pod \"stop-feature-test-kserve-5455dfc465-2dpvf\" (UID: \"76f951e6-6ffe-4e0f-b50c-a3c036bf3b3d\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5455dfc465-2dpvf" Apr 17 16:48:08.862814 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:48:08.862777 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/76f951e6-6ffe-4e0f-b50c-a3c036bf3b3d-home\") pod 
\"stop-feature-test-kserve-5455dfc465-2dpvf\" (UID: \"76f951e6-6ffe-4e0f-b50c-a3c036bf3b3d\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5455dfc465-2dpvf" Apr 17 16:48:08.862874 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:48:08.862821 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/76f951e6-6ffe-4e0f-b50c-a3c036bf3b3d-model-cache\") pod \"stop-feature-test-kserve-5455dfc465-2dpvf\" (UID: \"76f951e6-6ffe-4e0f-b50c-a3c036bf3b3d\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5455dfc465-2dpvf" Apr 17 16:48:08.862918 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:48:08.862881 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/76f951e6-6ffe-4e0f-b50c-a3c036bf3b3d-kserve-provision-location\") pod \"stop-feature-test-kserve-5455dfc465-2dpvf\" (UID: \"76f951e6-6ffe-4e0f-b50c-a3c036bf3b3d\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5455dfc465-2dpvf" Apr 17 16:48:08.864933 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:48:08.864907 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/76f951e6-6ffe-4e0f-b50c-a3c036bf3b3d-dshm\") pod \"stop-feature-test-kserve-5455dfc465-2dpvf\" (UID: \"76f951e6-6ffe-4e0f-b50c-a3c036bf3b3d\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5455dfc465-2dpvf" Apr 17 16:48:08.865029 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:48:08.864990 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/76f951e6-6ffe-4e0f-b50c-a3c036bf3b3d-tls-certs\") pod \"stop-feature-test-kserve-5455dfc465-2dpvf\" (UID: \"76f951e6-6ffe-4e0f-b50c-a3c036bf3b3d\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5455dfc465-2dpvf" Apr 17 16:48:08.870186 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:48:08.870165 2572 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5xrs\" (UniqueName: \"kubernetes.io/projected/76f951e6-6ffe-4e0f-b50c-a3c036bf3b3d-kube-api-access-s5xrs\") pod \"stop-feature-test-kserve-5455dfc465-2dpvf\" (UID: \"76f951e6-6ffe-4e0f-b50c-a3c036bf3b3d\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5455dfc465-2dpvf" Apr 17 16:48:08.996955 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:48:08.996921 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5455dfc465-2dpvf" Apr 17 16:48:09.123785 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:48:09.123758 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-5455dfc465-2dpvf"] Apr 17 16:48:09.125387 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:48:09.125363 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76f951e6_6ffe_4e0f_b50c_a3c036bf3b3d.slice/crio-18fcc81b010ca336dcd6dd0d0ecfc512156bcd77c4a2fddd14b4fb7d27f3ca96 WatchSource:0}: Error finding container 18fcc81b010ca336dcd6dd0d0ecfc512156bcd77c4a2fddd14b4fb7d27f3ca96: Status 404 returned error can't find the container with id 18fcc81b010ca336dcd6dd0d0ecfc512156bcd77c4a2fddd14b4fb7d27f3ca96 Apr 17 16:48:09.127358 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:48:09.127341 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 16:48:09.787307 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:48:09.787262 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5455dfc465-2dpvf" event={"ID":"76f951e6-6ffe-4e0f-b50c-a3c036bf3b3d","Type":"ContainerStarted","Data":"16c11c784f6b6e3ec859dbbe2613b9619b4922dddc09361a6ef7f7232c4d410b"} Apr 17 16:48:09.787307 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:48:09.787306 2572 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5455dfc465-2dpvf" event={"ID":"76f951e6-6ffe-4e0f-b50c-a3c036bf3b3d","Type":"ContainerStarted","Data":"18fcc81b010ca336dcd6dd0d0ecfc512156bcd77c4a2fddd14b4fb7d27f3ca96"} Apr 17 16:48:13.808104 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:48:13.808067 2572 generic.go:358] "Generic (PLEG): container finished" podID="76f951e6-6ffe-4e0f-b50c-a3c036bf3b3d" containerID="16c11c784f6b6e3ec859dbbe2613b9619b4922dddc09361a6ef7f7232c4d410b" exitCode=0 Apr 17 16:48:13.808526 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:48:13.808116 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5455dfc465-2dpvf" event={"ID":"76f951e6-6ffe-4e0f-b50c-a3c036bf3b3d","Type":"ContainerDied","Data":"16c11c784f6b6e3ec859dbbe2613b9619b4922dddc09361a6ef7f7232c4d410b"} Apr 17 16:48:14.813626 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:48:14.813589 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5455dfc465-2dpvf" event={"ID":"76f951e6-6ffe-4e0f-b50c-a3c036bf3b3d","Type":"ContainerStarted","Data":"5ce2127b30b5d364cceee91810f2148e94413bc76fe12535251c49ccb73f66e4"} Apr 17 16:48:14.834627 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:48:14.834580 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5455dfc465-2dpvf" podStartSLOduration=6.834564897 podStartE2EDuration="6.834564897s" podCreationTimestamp="2026-04-17 16:48:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:48:14.831984075 +0000 UTC m=+1017.419573045" watchObservedRunningTime="2026-04-17 16:48:14.834564897 +0000 UTC m=+1017.422153883" Apr 17 16:48:18.997685 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:48:18.997653 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not 
ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5455dfc465-2dpvf" Apr 17 16:48:18.998207 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:48:18.997749 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5455dfc465-2dpvf" Apr 17 16:48:18.999155 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:48:18.999127 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5455dfc465-2dpvf" podUID="76f951e6-6ffe-4e0f-b50c-a3c036bf3b3d" containerName="main" probeResult="failure" output="Get \"https://10.133.0.50:8000/health\": dial tcp 10.133.0.50:8000: connect: connection refused" Apr 17 16:48:24.001620 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:48:24.001597 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-5455dfc465-gttnd_40524dd1-3f23-4605-a734-add4726ddd49/main/0.log" Apr 17 16:48:24.002009 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:48:24.001979 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5455dfc465-gttnd" Apr 17 16:48:24.109374 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:48:24.109346 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-px7d4\" (UniqueName: \"kubernetes.io/projected/40524dd1-3f23-4605-a734-add4726ddd49-kube-api-access-px7d4\") pod \"40524dd1-3f23-4605-a734-add4726ddd49\" (UID: \"40524dd1-3f23-4605-a734-add4726ddd49\") " Apr 17 16:48:24.109555 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:48:24.109396 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/40524dd1-3f23-4605-a734-add4726ddd49-dshm\") pod \"40524dd1-3f23-4605-a734-add4726ddd49\" (UID: \"40524dd1-3f23-4605-a734-add4726ddd49\") " Apr 17 16:48:24.109555 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:48:24.109426 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/40524dd1-3f23-4605-a734-add4726ddd49-home\") pod \"40524dd1-3f23-4605-a734-add4726ddd49\" (UID: \"40524dd1-3f23-4605-a734-add4726ddd49\") " Apr 17 16:48:24.109555 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:48:24.109477 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/40524dd1-3f23-4605-a734-add4726ddd49-kserve-provision-location\") pod \"40524dd1-3f23-4605-a734-add4726ddd49\" (UID: \"40524dd1-3f23-4605-a734-add4726ddd49\") " Apr 17 16:48:24.109555 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:48:24.109510 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/40524dd1-3f23-4605-a734-add4726ddd49-tls-certs\") pod \"40524dd1-3f23-4605-a734-add4726ddd49\" (UID: \"40524dd1-3f23-4605-a734-add4726ddd49\") " Apr 17 16:48:24.109555 ip-10-0-131-177 kubenswrapper[2572]: 
I0417 16:48:24.109540 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/40524dd1-3f23-4605-a734-add4726ddd49-model-cache\") pod \"40524dd1-3f23-4605-a734-add4726ddd49\" (UID: \"40524dd1-3f23-4605-a734-add4726ddd49\") " Apr 17 16:48:24.109852 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:48:24.109810 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40524dd1-3f23-4605-a734-add4726ddd49-home" (OuterVolumeSpecName: "home") pod "40524dd1-3f23-4605-a734-add4726ddd49" (UID: "40524dd1-3f23-4605-a734-add4726ddd49"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:48:24.110168 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:48:24.109949 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40524dd1-3f23-4605-a734-add4726ddd49-model-cache" (OuterVolumeSpecName: "model-cache") pod "40524dd1-3f23-4605-a734-add4726ddd49" (UID: "40524dd1-3f23-4605-a734-add4726ddd49"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:48:24.110168 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:48:24.110066 2572 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/40524dd1-3f23-4605-a734-add4726ddd49-home\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\"" Apr 17 16:48:24.111573 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:48:24.111553 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40524dd1-3f23-4605-a734-add4726ddd49-kube-api-access-px7d4" (OuterVolumeSpecName: "kube-api-access-px7d4") pod "40524dd1-3f23-4605-a734-add4726ddd49" (UID: "40524dd1-3f23-4605-a734-add4726ddd49"). InnerVolumeSpecName "kube-api-access-px7d4". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:48:24.112039 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:48:24.112022 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40524dd1-3f23-4605-a734-add4726ddd49-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "40524dd1-3f23-4605-a734-add4726ddd49" (UID: "40524dd1-3f23-4605-a734-add4726ddd49"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:48:24.112497 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:48:24.112471 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40524dd1-3f23-4605-a734-add4726ddd49-dshm" (OuterVolumeSpecName: "dshm") pod "40524dd1-3f23-4605-a734-add4726ddd49" (UID: "40524dd1-3f23-4605-a734-add4726ddd49"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:48:24.163713 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:48:24.163671 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40524dd1-3f23-4605-a734-add4726ddd49-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "40524dd1-3f23-4605-a734-add4726ddd49" (UID: "40524dd1-3f23-4605-a734-add4726ddd49"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:48:24.211434 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:48:24.211400 2572 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/40524dd1-3f23-4605-a734-add4726ddd49-model-cache\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\"" Apr 17 16:48:24.211434 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:48:24.211434 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-px7d4\" (UniqueName: \"kubernetes.io/projected/40524dd1-3f23-4605-a734-add4726ddd49-kube-api-access-px7d4\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\"" Apr 17 16:48:24.211651 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:48:24.211445 2572 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/40524dd1-3f23-4605-a734-add4726ddd49-dshm\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\"" Apr 17 16:48:24.211651 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:48:24.211454 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/40524dd1-3f23-4605-a734-add4726ddd49-kserve-provision-location\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\"" Apr 17 16:48:24.211651 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:48:24.211464 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/40524dd1-3f23-4605-a734-add4726ddd49-tls-certs\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\"" Apr 17 16:48:24.853431 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:48:24.853400 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-5455dfc465-gttnd_40524dd1-3f23-4605-a734-add4726ddd49/main/0.log" Apr 17 16:48:24.853784 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:48:24.853762 2572 generic.go:358] "Generic (PLEG): container 
finished" podID="40524dd1-3f23-4605-a734-add4726ddd49" containerID="756e68bc56e8459d5d19871b588b0e7826180671f66c032ec0abd2351e77f9b7" exitCode=137 Apr 17 16:48:24.853865 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:48:24.853806 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5455dfc465-gttnd" event={"ID":"40524dd1-3f23-4605-a734-add4726ddd49","Type":"ContainerDied","Data":"756e68bc56e8459d5d19871b588b0e7826180671f66c032ec0abd2351e77f9b7"} Apr 17 16:48:24.853865 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:48:24.853828 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5455dfc465-gttnd" event={"ID":"40524dd1-3f23-4605-a734-add4726ddd49","Type":"ContainerDied","Data":"c47616fb965db440f7109b484d877bc69d496a6319fd7d61f00019776d85f74b"} Apr 17 16:48:24.853865 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:48:24.853832 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5455dfc465-gttnd" Apr 17 16:48:24.853865 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:48:24.853843 2572 scope.go:117] "RemoveContainer" containerID="756e68bc56e8459d5d19871b588b0e7826180671f66c032ec0abd2351e77f9b7" Apr 17 16:48:24.877557 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:48:24.877528 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-5455dfc465-gttnd"] Apr 17 16:48:24.880805 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:48:24.880781 2572 scope.go:117] "RemoveContainer" containerID="12292db98fd44b9b8332ea536bc3016912202a0b59180ed0a30968d202099f23" Apr 17 16:48:24.880927 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:48:24.880841 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-5455dfc465-gttnd"] Apr 17 16:48:24.946708 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:48:24.946682 2572 scope.go:117] 
"RemoveContainer" containerID="756e68bc56e8459d5d19871b588b0e7826180671f66c032ec0abd2351e77f9b7" Apr 17 16:48:24.947097 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:48:24.947077 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"756e68bc56e8459d5d19871b588b0e7826180671f66c032ec0abd2351e77f9b7\": container with ID starting with 756e68bc56e8459d5d19871b588b0e7826180671f66c032ec0abd2351e77f9b7 not found: ID does not exist" containerID="756e68bc56e8459d5d19871b588b0e7826180671f66c032ec0abd2351e77f9b7" Apr 17 16:48:24.947160 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:48:24.947110 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"756e68bc56e8459d5d19871b588b0e7826180671f66c032ec0abd2351e77f9b7"} err="failed to get container status \"756e68bc56e8459d5d19871b588b0e7826180671f66c032ec0abd2351e77f9b7\": rpc error: code = NotFound desc = could not find container \"756e68bc56e8459d5d19871b588b0e7826180671f66c032ec0abd2351e77f9b7\": container with ID starting with 756e68bc56e8459d5d19871b588b0e7826180671f66c032ec0abd2351e77f9b7 not found: ID does not exist" Apr 17 16:48:24.947160 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:48:24.947130 2572 scope.go:117] "RemoveContainer" containerID="12292db98fd44b9b8332ea536bc3016912202a0b59180ed0a30968d202099f23" Apr 17 16:48:24.947466 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:48:24.947439 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12292db98fd44b9b8332ea536bc3016912202a0b59180ed0a30968d202099f23\": container with ID starting with 12292db98fd44b9b8332ea536bc3016912202a0b59180ed0a30968d202099f23 not found: ID does not exist" containerID="12292db98fd44b9b8332ea536bc3016912202a0b59180ed0a30968d202099f23" Apr 17 16:48:24.947546 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:48:24.947475 2572 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12292db98fd44b9b8332ea536bc3016912202a0b59180ed0a30968d202099f23"} err="failed to get container status \"12292db98fd44b9b8332ea536bc3016912202a0b59180ed0a30968d202099f23\": rpc error: code = NotFound desc = could not find container \"12292db98fd44b9b8332ea536bc3016912202a0b59180ed0a30968d202099f23\": container with ID starting with 12292db98fd44b9b8332ea536bc3016912202a0b59180ed0a30968d202099f23 not found: ID does not exist" Apr 17 16:48:25.974490 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:48:25.974454 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40524dd1-3f23-4605-a734-add4726ddd49" path="/var/lib/kubelet/pods/40524dd1-3f23-4605-a734-add4726ddd49/volumes" Apr 17 16:48:28.998140 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:48:28.998092 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5455dfc465-2dpvf" podUID="76f951e6-6ffe-4e0f-b50c-a3c036bf3b3d" containerName="main" probeResult="failure" output="Get \"https://10.133.0.50:8000/health\": dial tcp 10.133.0.50:8000: connect: connection refused" Apr 17 16:48:38.997573 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:48:38.997527 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5455dfc465-2dpvf" podUID="76f951e6-6ffe-4e0f-b50c-a3c036bf3b3d" containerName="main" probeResult="failure" output="Get \"https://10.133.0.50:8000/health\": dial tcp 10.133.0.50:8000: connect: connection refused" Apr 17 16:48:48.997678 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:48:48.997630 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5455dfc465-2dpvf" podUID="76f951e6-6ffe-4e0f-b50c-a3c036bf3b3d" containerName="main" probeResult="failure" output="Get \"https://10.133.0.50:8000/health\": dial tcp 10.133.0.50:8000: connect: connection refused" Apr 17 16:48:58.997363 ip-10-0-131-177 
kubenswrapper[2572]: I0417 16:48:58.997322 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5455dfc465-2dpvf" podUID="76f951e6-6ffe-4e0f-b50c-a3c036bf3b3d" containerName="main" probeResult="failure" output="Get \"https://10.133.0.50:8000/health\": dial tcp 10.133.0.50:8000: connect: connection refused" Apr 17 16:49:08.997405 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:49:08.997319 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5455dfc465-2dpvf" podUID="76f951e6-6ffe-4e0f-b50c-a3c036bf3b3d" containerName="main" probeResult="failure" output="Get \"https://10.133.0.50:8000/health\": dial tcp 10.133.0.50:8000: connect: connection refused" Apr 17 16:49:18.997709 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:49:18.997658 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5455dfc465-2dpvf" podUID="76f951e6-6ffe-4e0f-b50c-a3c036bf3b3d" containerName="main" probeResult="failure" output="Get \"https://10.133.0.50:8000/health\": dial tcp 10.133.0.50:8000: connect: connection refused" Apr 17 16:49:28.998113 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:49:28.998065 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5455dfc465-2dpvf" podUID="76f951e6-6ffe-4e0f-b50c-a3c036bf3b3d" containerName="main" probeResult="failure" output="Get \"https://10.133.0.50:8000/health\": dial tcp 10.133.0.50:8000: connect: connection refused" Apr 17 16:49:38.998317 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:49:38.998268 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5455dfc465-2dpvf" podUID="76f951e6-6ffe-4e0f-b50c-a3c036bf3b3d" containerName="main" probeResult="failure" output="Get \"https://10.133.0.50:8000/health\": dial tcp 10.133.0.50:8000: connect: connection refused" Apr 17 16:49:39.674833 ip-10-0-131-177 
kubenswrapper[2572]: I0417 16:49:39.674799 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-654699475c-bzbq5"] Apr 17 16:49:39.675190 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:49:39.675175 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="40524dd1-3f23-4605-a734-add4726ddd49" containerName="storage-initializer" Apr 17 16:49:39.675190 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:49:39.675190 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="40524dd1-3f23-4605-a734-add4726ddd49" containerName="storage-initializer" Apr 17 16:49:39.675288 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:49:39.675215 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="40524dd1-3f23-4605-a734-add4726ddd49" containerName="main" Apr 17 16:49:39.675288 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:49:39.675224 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="40524dd1-3f23-4605-a734-add4726ddd49" containerName="main" Apr 17 16:49:39.675354 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:49:39.675301 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="40524dd1-3f23-4605-a734-add4726ddd49" containerName="main" Apr 17 16:49:39.678370 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:49:39.678350 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-654699475c-bzbq5" Apr 17 16:49:39.680676 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:49:39.680654 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-test-kserve-self-signed-certs\"" Apr 17 16:49:39.689712 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:49:39.689686 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-654699475c-bzbq5"] Apr 17 16:49:39.780219 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:49:39.780185 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b7b12800-f4b7-48f4-865a-a6f41ca430a3-kserve-provision-location\") pod \"router-with-refs-test-kserve-654699475c-bzbq5\" (UID: \"b7b12800-f4b7-48f4-865a-a6f41ca430a3\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-654699475c-bzbq5" Apr 17 16:49:39.780373 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:49:39.780233 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b7b12800-f4b7-48f4-865a-a6f41ca430a3-tls-certs\") pod \"router-with-refs-test-kserve-654699475c-bzbq5\" (UID: \"b7b12800-f4b7-48f4-865a-a6f41ca430a3\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-654699475c-bzbq5" Apr 17 16:49:39.780373 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:49:39.780287 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sww52\" (UniqueName: \"kubernetes.io/projected/b7b12800-f4b7-48f4-865a-a6f41ca430a3-kube-api-access-sww52\") pod \"router-with-refs-test-kserve-654699475c-bzbq5\" (UID: \"b7b12800-f4b7-48f4-865a-a6f41ca430a3\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-654699475c-bzbq5" Apr 17 16:49:39.780373 
ip-10-0-131-177 kubenswrapper[2572]: I0417 16:49:39.780331 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b7b12800-f4b7-48f4-865a-a6f41ca430a3-dshm\") pod \"router-with-refs-test-kserve-654699475c-bzbq5\" (UID: \"b7b12800-f4b7-48f4-865a-a6f41ca430a3\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-654699475c-bzbq5" Apr 17 16:49:39.780373 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:49:39.780357 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b7b12800-f4b7-48f4-865a-a6f41ca430a3-model-cache\") pod \"router-with-refs-test-kserve-654699475c-bzbq5\" (UID: \"b7b12800-f4b7-48f4-865a-a6f41ca430a3\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-654699475c-bzbq5" Apr 17 16:49:39.780553 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:49:39.780377 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b7b12800-f4b7-48f4-865a-a6f41ca430a3-home\") pod \"router-with-refs-test-kserve-654699475c-bzbq5\" (UID: \"b7b12800-f4b7-48f4-865a-a6f41ca430a3\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-654699475c-bzbq5" Apr 17 16:49:39.881433 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:49:39.881396 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b7b12800-f4b7-48f4-865a-a6f41ca430a3-tls-certs\") pod \"router-with-refs-test-kserve-654699475c-bzbq5\" (UID: \"b7b12800-f4b7-48f4-865a-a6f41ca430a3\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-654699475c-bzbq5" Apr 17 16:49:39.881593 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:49:39.881438 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sww52\" (UniqueName: 
\"kubernetes.io/projected/b7b12800-f4b7-48f4-865a-a6f41ca430a3-kube-api-access-sww52\") pod \"router-with-refs-test-kserve-654699475c-bzbq5\" (UID: \"b7b12800-f4b7-48f4-865a-a6f41ca430a3\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-654699475c-bzbq5" Apr 17 16:49:39.881593 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:49:39.881474 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b7b12800-f4b7-48f4-865a-a6f41ca430a3-dshm\") pod \"router-with-refs-test-kserve-654699475c-bzbq5\" (UID: \"b7b12800-f4b7-48f4-865a-a6f41ca430a3\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-654699475c-bzbq5" Apr 17 16:49:39.881593 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:49:39.881503 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b7b12800-f4b7-48f4-865a-a6f41ca430a3-model-cache\") pod \"router-with-refs-test-kserve-654699475c-bzbq5\" (UID: \"b7b12800-f4b7-48f4-865a-a6f41ca430a3\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-654699475c-bzbq5" Apr 17 16:49:39.881593 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:49:39.881533 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b7b12800-f4b7-48f4-865a-a6f41ca430a3-home\") pod \"router-with-refs-test-kserve-654699475c-bzbq5\" (UID: \"b7b12800-f4b7-48f4-865a-a6f41ca430a3\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-654699475c-bzbq5" Apr 17 16:49:39.881835 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:49:39.881601 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b7b12800-f4b7-48f4-865a-a6f41ca430a3-kserve-provision-location\") pod \"router-with-refs-test-kserve-654699475c-bzbq5\" (UID: \"b7b12800-f4b7-48f4-865a-a6f41ca430a3\") " 
pod="kserve-ci-e2e-test/router-with-refs-test-kserve-654699475c-bzbq5" Apr 17 16:49:39.882012 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:49:39.881980 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b7b12800-f4b7-48f4-865a-a6f41ca430a3-model-cache\") pod \"router-with-refs-test-kserve-654699475c-bzbq5\" (UID: \"b7b12800-f4b7-48f4-865a-a6f41ca430a3\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-654699475c-bzbq5" Apr 17 16:49:39.882012 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:49:39.882001 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b7b12800-f4b7-48f4-865a-a6f41ca430a3-home\") pod \"router-with-refs-test-kserve-654699475c-bzbq5\" (UID: \"b7b12800-f4b7-48f4-865a-a6f41ca430a3\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-654699475c-bzbq5" Apr 17 16:49:39.882361 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:49:39.882052 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b7b12800-f4b7-48f4-865a-a6f41ca430a3-kserve-provision-location\") pod \"router-with-refs-test-kserve-654699475c-bzbq5\" (UID: \"b7b12800-f4b7-48f4-865a-a6f41ca430a3\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-654699475c-bzbq5" Apr 17 16:49:39.883932 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:49:39.883910 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b7b12800-f4b7-48f4-865a-a6f41ca430a3-dshm\") pod \"router-with-refs-test-kserve-654699475c-bzbq5\" (UID: \"b7b12800-f4b7-48f4-865a-a6f41ca430a3\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-654699475c-bzbq5" Apr 17 16:49:39.884072 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:49:39.884054 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" 
(UniqueName: \"kubernetes.io/secret/b7b12800-f4b7-48f4-865a-a6f41ca430a3-tls-certs\") pod \"router-with-refs-test-kserve-654699475c-bzbq5\" (UID: \"b7b12800-f4b7-48f4-865a-a6f41ca430a3\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-654699475c-bzbq5" Apr 17 16:49:39.889136 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:49:39.889117 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sww52\" (UniqueName: \"kubernetes.io/projected/b7b12800-f4b7-48f4-865a-a6f41ca430a3-kube-api-access-sww52\") pod \"router-with-refs-test-kserve-654699475c-bzbq5\" (UID: \"b7b12800-f4b7-48f4-865a-a6f41ca430a3\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-654699475c-bzbq5" Apr 17 16:49:39.990290 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:49:39.990225 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-654699475c-bzbq5" Apr 17 16:49:40.158847 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:49:40.158033 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-654699475c-bzbq5"] Apr 17 16:49:40.167172 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:49:40.167101 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-654699475c-bzbq5" event={"ID":"b7b12800-f4b7-48f4-865a-a6f41ca430a3","Type":"ContainerStarted","Data":"57916f482bbb3ff62e3ae445896734930f39e40912e42b0e871fca79aac374d7"} Apr 17 16:49:41.174261 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:49:41.174216 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-654699475c-bzbq5" event={"ID":"b7b12800-f4b7-48f4-865a-a6f41ca430a3","Type":"ContainerStarted","Data":"f889ad793608a8728a01fa2baf2eeb1df9d62055b3af94cdd47e81c7a036d819"} Apr 17 16:49:45.191366 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:49:45.191330 2572 generic.go:358] "Generic (PLEG): 
container finished" podID="b7b12800-f4b7-48f4-865a-a6f41ca430a3" containerID="f889ad793608a8728a01fa2baf2eeb1df9d62055b3af94cdd47e81c7a036d819" exitCode=0 Apr 17 16:49:45.191775 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:49:45.191384 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-654699475c-bzbq5" event={"ID":"b7b12800-f4b7-48f4-865a-a6f41ca430a3","Type":"ContainerDied","Data":"f889ad793608a8728a01fa2baf2eeb1df9d62055b3af94cdd47e81c7a036d819"} Apr 17 16:49:46.196757 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:49:46.196701 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-654699475c-bzbq5" event={"ID":"b7b12800-f4b7-48f4-865a-a6f41ca430a3","Type":"ContainerStarted","Data":"3d00e78f83f57fee772491612eb3fa90c94450417329e3598cb21dcaeb71c0a9"} Apr 17 16:49:46.218887 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:49:46.218837 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-654699475c-bzbq5" podStartSLOduration=7.218823716 podStartE2EDuration="7.218823716s" podCreationTimestamp="2026-04-17 16:49:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:49:46.215911733 +0000 UTC m=+1108.803500703" watchObservedRunningTime="2026-04-17 16:49:46.218823716 +0000 UTC m=+1108.806412685" Apr 17 16:49:49.007442 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:49:49.007406 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5455dfc465-2dpvf" Apr 17 16:49:49.019969 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:49:49.019943 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5455dfc465-2dpvf" Apr 17 16:49:49.991225 ip-10-0-131-177 kubenswrapper[2572]: I0417 
16:49:49.991189 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-654699475c-bzbq5" Apr 17 16:49:49.991426 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:49:49.991281 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-654699475c-bzbq5" Apr 17 16:49:49.992846 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:49:49.992820 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-654699475c-bzbq5" podUID="b7b12800-f4b7-48f4-865a-a6f41ca430a3" containerName="main" probeResult="failure" output="Get \"https://10.133.0.51:8000/health\": dial tcp 10.133.0.51:8000: connect: connection refused" Apr 17 16:49:59.991030 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:49:59.990983 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-654699475c-bzbq5" podUID="b7b12800-f4b7-48f4-865a-a6f41ca430a3" containerName="main" probeResult="failure" output="Get \"https://10.133.0.51:8000/health\": dial tcp 10.133.0.51:8000: connect: connection refused" Apr 17 16:50:06.457258 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:50:06.457223 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-5455dfc465-2dpvf"] Apr 17 16:50:06.457694 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:50:06.457670 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5455dfc465-2dpvf" podUID="76f951e6-6ffe-4e0f-b50c-a3c036bf3b3d" containerName="main" containerID="cri-o://5ce2127b30b5d364cceee91810f2148e94413bc76fe12535251c49ccb73f66e4" gracePeriod=30 Apr 17 16:50:09.990739 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:50:09.990679 2572 prober.go:120] "Probe failed" probeType="Startup" 
pod="kserve-ci-e2e-test/router-with-refs-test-kserve-654699475c-bzbq5" podUID="b7b12800-f4b7-48f4-865a-a6f41ca430a3" containerName="main" probeResult="failure" output="Get \"https://10.133.0.51:8000/health\": dial tcp 10.133.0.51:8000: connect: connection refused" Apr 17 16:50:19.991218 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:50:19.991172 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-654699475c-bzbq5" podUID="b7b12800-f4b7-48f4-865a-a6f41ca430a3" containerName="main" probeResult="failure" output="Get \"https://10.133.0.51:8000/health\": dial tcp 10.133.0.51:8000: connect: connection refused" Apr 17 16:50:29.991417 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:50:29.991373 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-654699475c-bzbq5" podUID="b7b12800-f4b7-48f4-865a-a6f41ca430a3" containerName="main" probeResult="failure" output="Get \"https://10.133.0.51:8000/health\": dial tcp 10.133.0.51:8000: connect: connection refused" Apr 17 16:50:36.696735 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:50:36.696694 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-5455dfc465-2dpvf_76f951e6-6ffe-4e0f-b50c-a3c036bf3b3d/main/0.log" Apr 17 16:50:36.697149 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:50:36.697134 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5455dfc465-2dpvf" Apr 17 16:50:36.793206 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:50:36.793123 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/76f951e6-6ffe-4e0f-b50c-a3c036bf3b3d-tls-certs\") pod \"76f951e6-6ffe-4e0f-b50c-a3c036bf3b3d\" (UID: \"76f951e6-6ffe-4e0f-b50c-a3c036bf3b3d\") " Apr 17 16:50:36.793206 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:50:36.793188 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/76f951e6-6ffe-4e0f-b50c-a3c036bf3b3d-home\") pod \"76f951e6-6ffe-4e0f-b50c-a3c036bf3b3d\" (UID: \"76f951e6-6ffe-4e0f-b50c-a3c036bf3b3d\") " Apr 17 16:50:36.793412 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:50:36.793255 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/76f951e6-6ffe-4e0f-b50c-a3c036bf3b3d-kserve-provision-location\") pod \"76f951e6-6ffe-4e0f-b50c-a3c036bf3b3d\" (UID: \"76f951e6-6ffe-4e0f-b50c-a3c036bf3b3d\") " Apr 17 16:50:36.793412 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:50:36.793289 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/76f951e6-6ffe-4e0f-b50c-a3c036bf3b3d-dshm\") pod \"76f951e6-6ffe-4e0f-b50c-a3c036bf3b3d\" (UID: \"76f951e6-6ffe-4e0f-b50c-a3c036bf3b3d\") " Apr 17 16:50:36.793412 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:50:36.793330 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s5xrs\" (UniqueName: \"kubernetes.io/projected/76f951e6-6ffe-4e0f-b50c-a3c036bf3b3d-kube-api-access-s5xrs\") pod \"76f951e6-6ffe-4e0f-b50c-a3c036bf3b3d\" (UID: \"76f951e6-6ffe-4e0f-b50c-a3c036bf3b3d\") " Apr 17 16:50:36.793412 ip-10-0-131-177 kubenswrapper[2572]: 
I0417 16:50:36.793370 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/76f951e6-6ffe-4e0f-b50c-a3c036bf3b3d-model-cache\") pod \"76f951e6-6ffe-4e0f-b50c-a3c036bf3b3d\" (UID: \"76f951e6-6ffe-4e0f-b50c-a3c036bf3b3d\") " Apr 17 16:50:36.793617 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:50:36.793586 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76f951e6-6ffe-4e0f-b50c-a3c036bf3b3d-home" (OuterVolumeSpecName: "home") pod "76f951e6-6ffe-4e0f-b50c-a3c036bf3b3d" (UID: "76f951e6-6ffe-4e0f-b50c-a3c036bf3b3d"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:50:36.793693 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:50:36.793670 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76f951e6-6ffe-4e0f-b50c-a3c036bf3b3d-model-cache" (OuterVolumeSpecName: "model-cache") pod "76f951e6-6ffe-4e0f-b50c-a3c036bf3b3d" (UID: "76f951e6-6ffe-4e0f-b50c-a3c036bf3b3d"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:50:36.795664 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:50:36.795631 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76f951e6-6ffe-4e0f-b50c-a3c036bf3b3d-dshm" (OuterVolumeSpecName: "dshm") pod "76f951e6-6ffe-4e0f-b50c-a3c036bf3b3d" (UID: "76f951e6-6ffe-4e0f-b50c-a3c036bf3b3d"). InnerVolumeSpecName "dshm". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:50:36.795664 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:50:36.795635 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76f951e6-6ffe-4e0f-b50c-a3c036bf3b3d-kube-api-access-s5xrs" (OuterVolumeSpecName: "kube-api-access-s5xrs") pod "76f951e6-6ffe-4e0f-b50c-a3c036bf3b3d" (UID: "76f951e6-6ffe-4e0f-b50c-a3c036bf3b3d"). InnerVolumeSpecName "kube-api-access-s5xrs". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:50:36.795836 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:50:36.795766 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76f951e6-6ffe-4e0f-b50c-a3c036bf3b3d-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "76f951e6-6ffe-4e0f-b50c-a3c036bf3b3d" (UID: "76f951e6-6ffe-4e0f-b50c-a3c036bf3b3d"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:50:36.851679 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:50:36.851637 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76f951e6-6ffe-4e0f-b50c-a3c036bf3b3d-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "76f951e6-6ffe-4e0f-b50c-a3c036bf3b3d" (UID: "76f951e6-6ffe-4e0f-b50c-a3c036bf3b3d"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:50:36.894300 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:50:36.894260 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/76f951e6-6ffe-4e0f-b50c-a3c036bf3b3d-kserve-provision-location\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\"" Apr 17 16:50:36.894300 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:50:36.894298 2572 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/76f951e6-6ffe-4e0f-b50c-a3c036bf3b3d-dshm\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\"" Apr 17 16:50:36.894455 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:50:36.894316 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-s5xrs\" (UniqueName: \"kubernetes.io/projected/76f951e6-6ffe-4e0f-b50c-a3c036bf3b3d-kube-api-access-s5xrs\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\"" Apr 17 16:50:36.894455 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:50:36.894332 2572 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/76f951e6-6ffe-4e0f-b50c-a3c036bf3b3d-model-cache\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\"" Apr 17 16:50:36.894455 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:50:36.894347 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/76f951e6-6ffe-4e0f-b50c-a3c036bf3b3d-tls-certs\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\"" Apr 17 16:50:36.894455 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:50:36.894359 2572 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/76f951e6-6ffe-4e0f-b50c-a3c036bf3b3d-home\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\"" Apr 17 16:50:37.414772 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:50:37.414743 2572 
log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-5455dfc465-2dpvf_76f951e6-6ffe-4e0f-b50c-a3c036bf3b3d/main/0.log" Apr 17 16:50:37.415126 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:50:37.415100 2572 generic.go:358] "Generic (PLEG): container finished" podID="76f951e6-6ffe-4e0f-b50c-a3c036bf3b3d" containerID="5ce2127b30b5d364cceee91810f2148e94413bc76fe12535251c49ccb73f66e4" exitCode=137 Apr 17 16:50:37.415243 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:50:37.415166 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5455dfc465-2dpvf" event={"ID":"76f951e6-6ffe-4e0f-b50c-a3c036bf3b3d","Type":"ContainerDied","Data":"5ce2127b30b5d364cceee91810f2148e94413bc76fe12535251c49ccb73f66e4"} Apr 17 16:50:37.415243 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:50:37.415181 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5455dfc465-2dpvf" Apr 17 16:50:37.415243 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:50:37.415201 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5455dfc465-2dpvf" event={"ID":"76f951e6-6ffe-4e0f-b50c-a3c036bf3b3d","Type":"ContainerDied","Data":"18fcc81b010ca336dcd6dd0d0ecfc512156bcd77c4a2fddd14b4fb7d27f3ca96"} Apr 17 16:50:37.415243 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:50:37.415221 2572 scope.go:117] "RemoveContainer" containerID="5ce2127b30b5d364cceee91810f2148e94413bc76fe12535251c49ccb73f66e4" Apr 17 16:50:37.435103 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:50:37.435082 2572 scope.go:117] "RemoveContainer" containerID="16c11c784f6b6e3ec859dbbe2613b9619b4922dddc09361a6ef7f7232c4d410b" Apr 17 16:50:37.439158 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:50:37.439136 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-5455dfc465-2dpvf"] Apr 17 
16:50:37.443374 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:50:37.443347 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-5455dfc465-2dpvf"] Apr 17 16:50:37.502507 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:50:37.502314 2572 scope.go:117] "RemoveContainer" containerID="5ce2127b30b5d364cceee91810f2148e94413bc76fe12535251c49ccb73f66e4" Apr 17 16:50:37.502762 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:50:37.502730 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ce2127b30b5d364cceee91810f2148e94413bc76fe12535251c49ccb73f66e4\": container with ID starting with 5ce2127b30b5d364cceee91810f2148e94413bc76fe12535251c49ccb73f66e4 not found: ID does not exist" containerID="5ce2127b30b5d364cceee91810f2148e94413bc76fe12535251c49ccb73f66e4" Apr 17 16:50:37.502840 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:50:37.502771 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ce2127b30b5d364cceee91810f2148e94413bc76fe12535251c49ccb73f66e4"} err="failed to get container status \"5ce2127b30b5d364cceee91810f2148e94413bc76fe12535251c49ccb73f66e4\": rpc error: code = NotFound desc = could not find container \"5ce2127b30b5d364cceee91810f2148e94413bc76fe12535251c49ccb73f66e4\": container with ID starting with 5ce2127b30b5d364cceee91810f2148e94413bc76fe12535251c49ccb73f66e4 not found: ID does not exist" Apr 17 16:50:37.502840 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:50:37.502797 2572 scope.go:117] "RemoveContainer" containerID="16c11c784f6b6e3ec859dbbe2613b9619b4922dddc09361a6ef7f7232c4d410b" Apr 17 16:50:37.503101 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:50:37.503084 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16c11c784f6b6e3ec859dbbe2613b9619b4922dddc09361a6ef7f7232c4d410b\": container with ID starting with 
16c11c784f6b6e3ec859dbbe2613b9619b4922dddc09361a6ef7f7232c4d410b not found: ID does not exist" containerID="16c11c784f6b6e3ec859dbbe2613b9619b4922dddc09361a6ef7f7232c4d410b" Apr 17 16:50:37.503156 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:50:37.503106 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16c11c784f6b6e3ec859dbbe2613b9619b4922dddc09361a6ef7f7232c4d410b"} err="failed to get container status \"16c11c784f6b6e3ec859dbbe2613b9619b4922dddc09361a6ef7f7232c4d410b\": rpc error: code = NotFound desc = could not find container \"16c11c784f6b6e3ec859dbbe2613b9619b4922dddc09361a6ef7f7232c4d410b\": container with ID starting with 16c11c784f6b6e3ec859dbbe2613b9619b4922dddc09361a6ef7f7232c4d410b not found: ID does not exist" Apr 17 16:50:37.974614 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:50:37.974575 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76f951e6-6ffe-4e0f-b50c-a3c036bf3b3d" path="/var/lib/kubelet/pods/76f951e6-6ffe-4e0f-b50c-a3c036bf3b3d/volumes" Apr 17 16:50:39.990969 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:50:39.990919 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-654699475c-bzbq5" podUID="b7b12800-f4b7-48f4-865a-a6f41ca430a3" containerName="main" probeResult="failure" output="Get \"https://10.133.0.51:8000/health\": dial tcp 10.133.0.51:8000: connect: connection refused" Apr 17 16:50:49.991124 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:50:49.991080 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-654699475c-bzbq5" podUID="b7b12800-f4b7-48f4-865a-a6f41ca430a3" containerName="main" probeResult="failure" output="Get \"https://10.133.0.51:8000/health\": dial tcp 10.133.0.51:8000: connect: connection refused" Apr 17 16:50:59.991121 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:50:59.991069 2572 prober.go:120] "Probe failed" probeType="Startup" 
pod="kserve-ci-e2e-test/router-with-refs-test-kserve-654699475c-bzbq5" podUID="b7b12800-f4b7-48f4-865a-a6f41ca430a3" containerName="main" probeResult="failure" output="Get \"https://10.133.0.51:8000/health\": dial tcp 10.133.0.51:8000: connect: connection refused" Apr 17 16:51:09.991349 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:51:09.991307 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-654699475c-bzbq5" podUID="b7b12800-f4b7-48f4-865a-a6f41ca430a3" containerName="main" probeResult="failure" output="Get \"https://10.133.0.51:8000/health\": dial tcp 10.133.0.51:8000: connect: connection refused" Apr 17 16:51:17.938444 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:51:17.938417 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-qghlf_4135c5b6-7f8a-4eaf-b551-405c8ab00981/console-operator/1.log" Apr 17 16:51:17.943373 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:51:17.943352 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vj4bz_f0075f81-88ff-4518-bd4e-bc50656a593b/ovn-acl-logging/0.log" Apr 17 16:51:17.944380 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:51:17.944362 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-qghlf_4135c5b6-7f8a-4eaf-b551-405c8ab00981/console-operator/1.log" Apr 17 16:51:17.949149 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:51:17.949130 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vj4bz_f0075f81-88ff-4518-bd4e-bc50656a593b/ovn-acl-logging/0.log" Apr 17 16:51:20.002321 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:51:20.002291 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-654699475c-bzbq5" Apr 17 16:51:20.010204 ip-10-0-131-177 kubenswrapper[2572]: I0417 
16:51:20.010174 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-654699475c-bzbq5" Apr 17 16:51:34.575840 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:51:34.575807 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-654699475c-bzbq5"] Apr 17 16:51:34.576261 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:51:34.576204 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-654699475c-bzbq5" podUID="b7b12800-f4b7-48f4-865a-a6f41ca430a3" containerName="main" containerID="cri-o://3d00e78f83f57fee772491612eb3fa90c94450417329e3598cb21dcaeb71c0a9" gracePeriod=30 Apr 17 16:51:56.204288 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:51:56.204243 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-74njplg"] Apr 17 16:51:56.204704 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:51:56.204656 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="76f951e6-6ffe-4e0f-b50c-a3c036bf3b3d" containerName="storage-initializer" Apr 17 16:51:56.204704 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:51:56.204669 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="76f951e6-6ffe-4e0f-b50c-a3c036bf3b3d" containerName="storage-initializer" Apr 17 16:51:56.204704 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:51:56.204687 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="76f951e6-6ffe-4e0f-b50c-a3c036bf3b3d" containerName="main" Apr 17 16:51:56.204704 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:51:56.204693 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="76f951e6-6ffe-4e0f-b50c-a3c036bf3b3d" containerName="main" Apr 17 16:51:56.204891 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:51:56.204779 2572 memory_manager.go:356] 
"RemoveStaleState removing state" podUID="76f951e6-6ffe-4e0f-b50c-a3c036bf3b3d" containerName="main" Apr 17 16:51:56.207030 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:51:56.207008 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-74njplg" Apr 17 16:51:56.210052 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:51:56.210030 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv8f1a6f044e8c7a4d31a250e0c4861caf-kserve-self-signed-certs\"" Apr 17 16:51:56.221515 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:51:56.221485 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-74njplg"] Apr 17 16:51:56.383395 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:51:56.383361 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e5526f92-5731-4f5e-bdbe-860b29954df2-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-74njplg\" (UID: \"e5526f92-5731-4f5e-bdbe-860b29954df2\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-74njplg" Apr 17 16:51:56.383562 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:51:56.383408 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e5526f92-5731-4f5e-bdbe-860b29954df2-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-74njplg\" (UID: \"e5526f92-5731-4f5e-bdbe-860b29954df2\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-74njplg" Apr 17 16:51:56.383562 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:51:56.383479 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8tmn\" (UniqueName: \"kubernetes.io/projected/e5526f92-5731-4f5e-bdbe-860b29954df2-kube-api-access-q8tmn\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-74njplg\" (UID: \"e5526f92-5731-4f5e-bdbe-860b29954df2\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-74njplg" Apr 17 16:51:56.383562 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:51:56.383547 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e5526f92-5731-4f5e-bdbe-860b29954df2-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-74njplg\" (UID: \"e5526f92-5731-4f5e-bdbe-860b29954df2\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-74njplg" Apr 17 16:51:56.383667 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:51:56.383584 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e5526f92-5731-4f5e-bdbe-860b29954df2-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-74njplg\" (UID: \"e5526f92-5731-4f5e-bdbe-860b29954df2\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-74njplg" Apr 17 16:51:56.383667 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:51:56.383603 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e5526f92-5731-4f5e-bdbe-860b29954df2-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-74njplg\" (UID: \"e5526f92-5731-4f5e-bdbe-860b29954df2\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-74njplg" Apr 17 16:51:56.484779 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:51:56.484673 2572 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e5526f92-5731-4f5e-bdbe-860b29954df2-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-74njplg\" (UID: \"e5526f92-5731-4f5e-bdbe-860b29954df2\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-74njplg" Apr 17 16:51:56.484779 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:51:56.484741 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e5526f92-5731-4f5e-bdbe-860b29954df2-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-74njplg\" (UID: \"e5526f92-5731-4f5e-bdbe-860b29954df2\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-74njplg" Apr 17 16:51:56.484779 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:51:56.484776 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q8tmn\" (UniqueName: \"kubernetes.io/projected/e5526f92-5731-4f5e-bdbe-860b29954df2-kube-api-access-q8tmn\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-74njplg\" (UID: \"e5526f92-5731-4f5e-bdbe-860b29954df2\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-74njplg" Apr 17 16:51:56.485042 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:51:56.484826 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e5526f92-5731-4f5e-bdbe-860b29954df2-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-74njplg\" (UID: \"e5526f92-5731-4f5e-bdbe-860b29954df2\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-74njplg" Apr 17 16:51:56.485042 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:51:56.484862 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e5526f92-5731-4f5e-bdbe-860b29954df2-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-74njplg\" (UID: \"e5526f92-5731-4f5e-bdbe-860b29954df2\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-74njplg" Apr 17 16:51:56.485042 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:51:56.484890 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e5526f92-5731-4f5e-bdbe-860b29954df2-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-74njplg\" (UID: \"e5526f92-5731-4f5e-bdbe-860b29954df2\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-74njplg" Apr 17 16:51:56.485235 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:51:56.485131 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e5526f92-5731-4f5e-bdbe-860b29954df2-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-74njplg\" (UID: \"e5526f92-5731-4f5e-bdbe-860b29954df2\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-74njplg" Apr 17 16:51:56.485235 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:51:56.485184 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e5526f92-5731-4f5e-bdbe-860b29954df2-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-74njplg\" (UID: \"e5526f92-5731-4f5e-bdbe-860b29954df2\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-74njplg" Apr 17 16:51:56.485329 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:51:56.485250 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: 
\"kubernetes.io/empty-dir/e5526f92-5731-4f5e-bdbe-860b29954df2-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-74njplg\" (UID: \"e5526f92-5731-4f5e-bdbe-860b29954df2\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-74njplg" Apr 17 16:51:56.487137 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:51:56.487117 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e5526f92-5731-4f5e-bdbe-860b29954df2-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-74njplg\" (UID: \"e5526f92-5731-4f5e-bdbe-860b29954df2\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-74njplg" Apr 17 16:51:56.487424 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:51:56.487404 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e5526f92-5731-4f5e-bdbe-860b29954df2-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-74njplg\" (UID: \"e5526f92-5731-4f5e-bdbe-860b29954df2\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-74njplg" Apr 17 16:51:56.497476 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:51:56.497444 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8tmn\" (UniqueName: \"kubernetes.io/projected/e5526f92-5731-4f5e-bdbe-860b29954df2-kube-api-access-q8tmn\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-74njplg\" (UID: \"e5526f92-5731-4f5e-bdbe-860b29954df2\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-74njplg" Apr 17 16:51:56.519456 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:51:56.519432 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-74njplg" Apr 17 16:51:56.653344 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:51:56.653306 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-74njplg"] Apr 17 16:51:56.654583 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:51:56.654552 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5526f92_5731_4f5e_bdbe_860b29954df2.slice/crio-578041ce6e2f0c384780c8c1eaccf501ac435fcf62fd24bd7fcaf6187e8c962e WatchSource:0}: Error finding container 578041ce6e2f0c384780c8c1eaccf501ac435fcf62fd24bd7fcaf6187e8c962e: Status 404 returned error can't find the container with id 578041ce6e2f0c384780c8c1eaccf501ac435fcf62fd24bd7fcaf6187e8c962e Apr 17 16:51:56.728056 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:51:56.728024 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-74njplg" event={"ID":"e5526f92-5731-4f5e-bdbe-860b29954df2","Type":"ContainerStarted","Data":"7a4135e9ce96eb645f435e9d5f64737b5f2cb544885a3d1d1f2422fa5bf55be3"} Apr 17 16:51:56.728158 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:51:56.728066 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-74njplg" event={"ID":"e5526f92-5731-4f5e-bdbe-860b29954df2","Type":"ContainerStarted","Data":"578041ce6e2f0c384780c8c1eaccf501ac435fcf62fd24bd7fcaf6187e8c962e"} Apr 17 16:52:00.747634 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:52:00.747599 2572 generic.go:358] "Generic (PLEG): container finished" podID="e5526f92-5731-4f5e-bdbe-860b29954df2" containerID="7a4135e9ce96eb645f435e9d5f64737b5f2cb544885a3d1d1f2422fa5bf55be3" exitCode=0 Apr 17 16:52:00.748055 ip-10-0-131-177 kubenswrapper[2572]: I0417 
16:52:00.747674 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-74njplg" event={"ID":"e5526f92-5731-4f5e-bdbe-860b29954df2","Type":"ContainerDied","Data":"7a4135e9ce96eb645f435e9d5f64737b5f2cb544885a3d1d1f2422fa5bf55be3"} Apr 17 16:52:01.753782 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:52:01.753744 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-74njplg" event={"ID":"e5526f92-5731-4f5e-bdbe-860b29954df2","Type":"ContainerStarted","Data":"c4459ef4b1a8a4b765d6a4ba194620a0df49ca0fda791bbe363bb35fc48a9ad6"} Apr 17 16:52:01.777054 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:52:01.776996 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-74njplg" podStartSLOduration=5.77697946 podStartE2EDuration="5.77697946s" podCreationTimestamp="2026-04-17 16:51:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:52:01.775565908 +0000 UTC m=+1244.363154878" watchObservedRunningTime="2026-04-17 16:52:01.77697946 +0000 UTC m=+1244.364568429" Apr 17 16:52:04.766817 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:52:04.766785 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-test-kserve-654699475c-bzbq5_b7b12800-f4b7-48f4-865a-a6f41ca430a3/main/0.log" Apr 17 16:52:04.767236 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:52:04.767171 2572 generic.go:358] "Generic (PLEG): container finished" podID="b7b12800-f4b7-48f4-865a-a6f41ca430a3" containerID="3d00e78f83f57fee772491612eb3fa90c94450417329e3598cb21dcaeb71c0a9" exitCode=137 Apr 17 16:52:04.767300 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:52:04.767225 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/router-with-refs-test-kserve-654699475c-bzbq5" event={"ID":"b7b12800-f4b7-48f4-865a-a6f41ca430a3","Type":"ContainerDied","Data":"3d00e78f83f57fee772491612eb3fa90c94450417329e3598cb21dcaeb71c0a9"} Apr 17 16:52:04.840094 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:52:04.840076 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-test-kserve-654699475c-bzbq5_b7b12800-f4b7-48f4-865a-a6f41ca430a3/main/0.log" Apr 17 16:52:04.840469 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:52:04.840452 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-654699475c-bzbq5" Apr 17 16:52:04.870364 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:52:04.870333 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b7b12800-f4b7-48f4-865a-a6f41ca430a3-dshm\") pod \"b7b12800-f4b7-48f4-865a-a6f41ca430a3\" (UID: \"b7b12800-f4b7-48f4-865a-a6f41ca430a3\") " Apr 17 16:52:04.870364 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:52:04.870367 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b7b12800-f4b7-48f4-865a-a6f41ca430a3-model-cache\") pod \"b7b12800-f4b7-48f4-865a-a6f41ca430a3\" (UID: \"b7b12800-f4b7-48f4-865a-a6f41ca430a3\") " Apr 17 16:52:04.870562 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:52:04.870433 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b7b12800-f4b7-48f4-865a-a6f41ca430a3-tls-certs\") pod \"b7b12800-f4b7-48f4-865a-a6f41ca430a3\" (UID: \"b7b12800-f4b7-48f4-865a-a6f41ca430a3\") " Apr 17 16:52:04.870562 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:52:04.870478 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: 
\"kubernetes.io/empty-dir/b7b12800-f4b7-48f4-865a-a6f41ca430a3-home\") pod \"b7b12800-f4b7-48f4-865a-a6f41ca430a3\" (UID: \"b7b12800-f4b7-48f4-865a-a6f41ca430a3\") " Apr 17 16:52:04.870562 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:52:04.870529 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sww52\" (UniqueName: \"kubernetes.io/projected/b7b12800-f4b7-48f4-865a-a6f41ca430a3-kube-api-access-sww52\") pod \"b7b12800-f4b7-48f4-865a-a6f41ca430a3\" (UID: \"b7b12800-f4b7-48f4-865a-a6f41ca430a3\") " Apr 17 16:52:04.870750 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:52:04.870563 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b7b12800-f4b7-48f4-865a-a6f41ca430a3-kserve-provision-location\") pod \"b7b12800-f4b7-48f4-865a-a6f41ca430a3\" (UID: \"b7b12800-f4b7-48f4-865a-a6f41ca430a3\") " Apr 17 16:52:04.870750 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:52:04.870681 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7b12800-f4b7-48f4-865a-a6f41ca430a3-model-cache" (OuterVolumeSpecName: "model-cache") pod "b7b12800-f4b7-48f4-865a-a6f41ca430a3" (UID: "b7b12800-f4b7-48f4-865a-a6f41ca430a3"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:52:04.870936 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:52:04.870909 2572 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b7b12800-f4b7-48f4-865a-a6f41ca430a3-model-cache\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\"" Apr 17 16:52:04.871824 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:52:04.871777 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7b12800-f4b7-48f4-865a-a6f41ca430a3-home" (OuterVolumeSpecName: "home") pod "b7b12800-f4b7-48f4-865a-a6f41ca430a3" (UID: "b7b12800-f4b7-48f4-865a-a6f41ca430a3"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:52:04.872878 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:52:04.872845 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7b12800-f4b7-48f4-865a-a6f41ca430a3-dshm" (OuterVolumeSpecName: "dshm") pod "b7b12800-f4b7-48f4-865a-a6f41ca430a3" (UID: "b7b12800-f4b7-48f4-865a-a6f41ca430a3"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:52:04.873134 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:52:04.873107 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7b12800-f4b7-48f4-865a-a6f41ca430a3-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "b7b12800-f4b7-48f4-865a-a6f41ca430a3" (UID: "b7b12800-f4b7-48f4-865a-a6f41ca430a3"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:52:04.873303 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:52:04.873279 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7b12800-f4b7-48f4-865a-a6f41ca430a3-kube-api-access-sww52" (OuterVolumeSpecName: "kube-api-access-sww52") pod "b7b12800-f4b7-48f4-865a-a6f41ca430a3" (UID: "b7b12800-f4b7-48f4-865a-a6f41ca430a3"). InnerVolumeSpecName "kube-api-access-sww52". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:52:04.945896 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:52:04.945835 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7b12800-f4b7-48f4-865a-a6f41ca430a3-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "b7b12800-f4b7-48f4-865a-a6f41ca430a3" (UID: "b7b12800-f4b7-48f4-865a-a6f41ca430a3"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:52:04.971838 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:52:04.971809 2572 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b7b12800-f4b7-48f4-865a-a6f41ca430a3-home\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\"" Apr 17 16:52:04.971838 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:52:04.971837 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sww52\" (UniqueName: \"kubernetes.io/projected/b7b12800-f4b7-48f4-865a-a6f41ca430a3-kube-api-access-sww52\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\"" Apr 17 16:52:04.971975 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:52:04.971849 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b7b12800-f4b7-48f4-865a-a6f41ca430a3-kserve-provision-location\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\"" Apr 17 
16:52:04.971975 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:52:04.971858 2572 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b7b12800-f4b7-48f4-865a-a6f41ca430a3-dshm\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\"" Apr 17 16:52:04.971975 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:52:04.971867 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b7b12800-f4b7-48f4-865a-a6f41ca430a3-tls-certs\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\"" Apr 17 16:52:05.772363 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:52:05.772335 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-test-kserve-654699475c-bzbq5_b7b12800-f4b7-48f4-865a-a6f41ca430a3/main/0.log" Apr 17 16:52:05.772807 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:52:05.772791 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-654699475c-bzbq5" Apr 17 16:52:05.772898 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:52:05.772783 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-654699475c-bzbq5" event={"ID":"b7b12800-f4b7-48f4-865a-a6f41ca430a3","Type":"ContainerDied","Data":"57916f482bbb3ff62e3ae445896734930f39e40912e42b0e871fca79aac374d7"} Apr 17 16:52:05.772937 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:52:05.772927 2572 scope.go:117] "RemoveContainer" containerID="3d00e78f83f57fee772491612eb3fa90c94450417329e3598cb21dcaeb71c0a9" Apr 17 16:52:05.796120 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:52:05.796091 2572 scope.go:117] "RemoveContainer" containerID="f889ad793608a8728a01fa2baf2eeb1df9d62055b3af94cdd47e81c7a036d819" Apr 17 16:52:05.796242 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:52:05.796166 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-654699475c-bzbq5"] Apr 17 16:52:05.798919 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:52:05.798897 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-654699475c-bzbq5"] Apr 17 16:52:05.973989 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:52:05.973957 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7b12800-f4b7-48f4-865a-a6f41ca430a3" path="/var/lib/kubelet/pods/b7b12800-f4b7-48f4-865a-a6f41ca430a3/volumes" Apr 17 16:52:06.520466 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:52:06.520436 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-74njplg" Apr 17 16:52:06.520662 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:52:06.520478 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-74njplg" Apr 17 16:52:06.522227 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:52:06.522195 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-74njplg" podUID="e5526f92-5731-4f5e-bdbe-860b29954df2" containerName="main" probeResult="failure" output="Get \"https://10.133.0.52:8000/health\": dial tcp 10.133.0.52:8000: connect: connection refused" Apr 17 16:52:16.520171 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:52:16.520117 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-74njplg" podUID="e5526f92-5731-4f5e-bdbe-860b29954df2" containerName="main" probeResult="failure" output="Get \"https://10.133.0.52:8000/health\": dial tcp 10.133.0.52:8000: connect: connection refused" Apr 17 16:52:26.520559 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:52:26.520508 2572 prober.go:120] "Probe 
failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-74njplg" podUID="e5526f92-5731-4f5e-bdbe-860b29954df2" containerName="main" probeResult="failure" output="Get \"https://10.133.0.52:8000/health\": dial tcp 10.133.0.52:8000: connect: connection refused" Apr 17 16:52:36.520376 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:52:36.520324 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-74njplg" podUID="e5526f92-5731-4f5e-bdbe-860b29954df2" containerName="main" probeResult="failure" output="Get \"https://10.133.0.52:8000/health\": dial tcp 10.133.0.52:8000: connect: connection refused" Apr 17 16:52:46.520124 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:52:46.520076 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-74njplg" podUID="e5526f92-5731-4f5e-bdbe-860b29954df2" containerName="main" probeResult="failure" output="Get \"https://10.133.0.52:8000/health\": dial tcp 10.133.0.52:8000: connect: connection refused" Apr 17 16:52:56.519889 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:52:56.519848 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-74njplg" podUID="e5526f92-5731-4f5e-bdbe-860b29954df2" containerName="main" probeResult="failure" output="Get \"https://10.133.0.52:8000/health\": dial tcp 10.133.0.52:8000: connect: connection refused" Apr 17 16:53:06.520415 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:53:06.520370 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-74njplg" podUID="e5526f92-5731-4f5e-bdbe-860b29954df2" containerName="main" probeResult="failure" output="Get \"https://10.133.0.52:8000/health\": dial tcp 10.133.0.52:8000: connect: connection refused" Apr 17 
16:53:16.520121 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:53:16.520081 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-74njplg" podUID="e5526f92-5731-4f5e-bdbe-860b29954df2" containerName="main" probeResult="failure" output="Get \"https://10.133.0.52:8000/health\": dial tcp 10.133.0.52:8000: connect: connection refused" Apr 17 16:53:22.429706 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:53:22.429665 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"] Apr 17 16:53:22.430249 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:53:22.430229 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b7b12800-f4b7-48f4-865a-a6f41ca430a3" containerName="storage-initializer" Apr 17 16:53:22.430339 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:53:22.430255 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7b12800-f4b7-48f4-865a-a6f41ca430a3" containerName="storage-initializer" Apr 17 16:53:22.430339 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:53:22.430277 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b7b12800-f4b7-48f4-865a-a6f41ca430a3" containerName="main" Apr 17 16:53:22.430339 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:53:22.430285 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7b12800-f4b7-48f4-865a-a6f41ca430a3" containerName="main" Apr 17 16:53:22.430495 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:53:22.430388 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="b7b12800-f4b7-48f4-865a-a6f41ca430a3" containerName="main" Apr 17 16:53:22.434828 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:53:22.434805 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 17 16:53:22.437197 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:53:22.437169 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-f312f5-cb7fb8cf-dockercfg-xmkgh\"" Apr 17 16:53:22.437757 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:53:22.437737 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs\"" Apr 17 16:53:22.446602 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:53:22.446583 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"] Apr 17 16:53:22.549859 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:53:22.549829 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/87edd822-f641-4b20-8bb7-7a7bc47058ab-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"87edd822-f641-4b20-8bb7-7a7bc47058ab\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 17 16:53:22.549859 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:53:22.549871 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/87edd822-f641-4b20-8bb7-7a7bc47058ab-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"87edd822-f641-4b20-8bb7-7a7bc47058ab\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 17 16:53:22.550127 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:53:22.549889 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/87edd822-f641-4b20-8bb7-7a7bc47058ab-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"87edd822-f641-4b20-8bb7-7a7bc47058ab\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 17 16:53:22.550127 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:53:22.550026 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/87edd822-f641-4b20-8bb7-7a7bc47058ab-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"87edd822-f641-4b20-8bb7-7a7bc47058ab\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 17 16:53:22.550127 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:53:22.550084 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvw76\" (UniqueName: \"kubernetes.io/projected/87edd822-f641-4b20-8bb7-7a7bc47058ab-kube-api-access-kvw76\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"87edd822-f641-4b20-8bb7-7a7bc47058ab\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 17 16:53:22.550262 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:53:22.550153 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/87edd822-f641-4b20-8bb7-7a7bc47058ab-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"87edd822-f641-4b20-8bb7-7a7bc47058ab\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 17 16:53:22.651149 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:53:22.651114 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/87edd822-f641-4b20-8bb7-7a7bc47058ab-tls-certs\") 
pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"87edd822-f641-4b20-8bb7-7a7bc47058ab\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 17 16:53:22.651149 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:53:22.651158 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/87edd822-f641-4b20-8bb7-7a7bc47058ab-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"87edd822-f641-4b20-8bb7-7a7bc47058ab\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 17 16:53:22.651401 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:53:22.651179 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/87edd822-f641-4b20-8bb7-7a7bc47058ab-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"87edd822-f641-4b20-8bb7-7a7bc47058ab\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 17 16:53:22.651401 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:53:22.651233 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/87edd822-f641-4b20-8bb7-7a7bc47058ab-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"87edd822-f641-4b20-8bb7-7a7bc47058ab\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 17 16:53:22.651401 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:53:22.651272 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kvw76\" (UniqueName: \"kubernetes.io/projected/87edd822-f641-4b20-8bb7-7a7bc47058ab-kube-api-access-kvw76\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"87edd822-f641-4b20-8bb7-7a7bc47058ab\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 17 16:53:22.651401 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:53:22.651339 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/87edd822-f641-4b20-8bb7-7a7bc47058ab-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"87edd822-f641-4b20-8bb7-7a7bc47058ab\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 17 16:53:22.651673 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:53:22.651649 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/87edd822-f641-4b20-8bb7-7a7bc47058ab-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"87edd822-f641-4b20-8bb7-7a7bc47058ab\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 17 16:53:22.651763 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:53:22.651739 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/87edd822-f641-4b20-8bb7-7a7bc47058ab-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"87edd822-f641-4b20-8bb7-7a7bc47058ab\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 17 16:53:22.651818 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:53:22.651770 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/87edd822-f641-4b20-8bb7-7a7bc47058ab-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"87edd822-f641-4b20-8bb7-7a7bc47058ab\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 17 16:53:22.653514 ip-10-0-131-177 kubenswrapper[2572]: I0417 
16:53:22.653493 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/87edd822-f641-4b20-8bb7-7a7bc47058ab-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"87edd822-f641-4b20-8bb7-7a7bc47058ab\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 17 16:53:22.653842 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:53:22.653823 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/87edd822-f641-4b20-8bb7-7a7bc47058ab-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"87edd822-f641-4b20-8bb7-7a7bc47058ab\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 17 16:53:22.659412 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:53:22.659387 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvw76\" (UniqueName: \"kubernetes.io/projected/87edd822-f641-4b20-8bb7-7a7bc47058ab-kube-api-access-kvw76\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"87edd822-f641-4b20-8bb7-7a7bc47058ab\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 17 16:53:22.747349 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:53:22.747270 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 17 16:53:22.880898 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:53:22.880868 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"] Apr 17 16:53:22.882930 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:53:22.882886 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87edd822_f641_4b20_8bb7_7a7bc47058ab.slice/crio-a9d54c5d1f397d991d52312bd9e00d6f3e56ba6569d77e5f2d78d43487a075b9 WatchSource:0}: Error finding container a9d54c5d1f397d991d52312bd9e00d6f3e56ba6569d77e5f2d78d43487a075b9: Status 404 returned error can't find the container with id a9d54c5d1f397d991d52312bd9e00d6f3e56ba6569d77e5f2d78d43487a075b9 Apr 17 16:53:22.884973 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:53:22.884955 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 16:53:23.099007 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:53:23.098976 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"87edd822-f641-4b20-8bb7-7a7bc47058ab","Type":"ContainerStarted","Data":"a7ceb24c90abebefee05513df884f788aaaa322d9e47bc10a6addcf1c7d106c6"} Apr 17 16:53:23.099007 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:53:23.099015 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"87edd822-f641-4b20-8bb7-7a7bc47058ab","Type":"ContainerStarted","Data":"a9d54c5d1f397d991d52312bd9e00d6f3e56ba6569d77e5f2d78d43487a075b9"} Apr 17 16:53:26.520580 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:53:26.520518 2572 prober.go:120] "Probe failed" probeType="Startup" 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-74njplg" podUID="e5526f92-5731-4f5e-bdbe-860b29954df2" containerName="main" probeResult="failure" output="Get \"https://10.133.0.52:8000/health\": dial tcp 10.133.0.52:8000: connect: connection refused" Apr 17 16:53:27.119502 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:53:27.119464 2572 generic.go:358] "Generic (PLEG): container finished" podID="87edd822-f641-4b20-8bb7-7a7bc47058ab" containerID="a7ceb24c90abebefee05513df884f788aaaa322d9e47bc10a6addcf1c7d106c6" exitCode=0 Apr 17 16:53:27.119744 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:53:27.119532 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"87edd822-f641-4b20-8bb7-7a7bc47058ab","Type":"ContainerDied","Data":"a7ceb24c90abebefee05513df884f788aaaa322d9e47bc10a6addcf1c7d106c6"} Apr 17 16:53:28.126105 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:53:28.126066 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"87edd822-f641-4b20-8bb7-7a7bc47058ab","Type":"ContainerStarted","Data":"787ceeb14f41b34c6c3be0eea225552fbf433064e5367d962b1de395b511be0b"} Apr 17 16:53:28.148993 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:53:28.148926 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podStartSLOduration=6.148906533 podStartE2EDuration="6.148906533s" podCreationTimestamp="2026-04-17 16:53:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:53:28.14621604 +0000 UTC m=+1330.733805012" watchObservedRunningTime="2026-04-17 16:53:28.148906533 +0000 UTC m=+1330.736495541" Apr 17 16:53:32.747815 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:53:32.747776 2572 
kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 17 16:53:32.749414 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:53:32.749374 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="87edd822-f641-4b20-8bb7-7a7bc47058ab" containerName="main" probeResult="failure" output="Get \"https://10.133.0.53:8000/health\": dial tcp 10.133.0.53:8000: connect: connection refused" Apr 17 16:53:36.530780 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:53:36.530680 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-74njplg" Apr 17 16:53:36.538784 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:53:36.538756 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-74njplg" Apr 17 16:53:42.747873 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:53:42.747823 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="87edd822-f641-4b20-8bb7-7a7bc47058ab" containerName="main" probeResult="failure" output="Get \"https://10.133.0.53:8000/health\": dial tcp 10.133.0.53:8000: connect: connection refused" Apr 17 16:53:52.747588 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:53:52.747541 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 17 16:53:52.748003 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:53:52.747857 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="87edd822-f641-4b20-8bb7-7a7bc47058ab" containerName="main" probeResult="failure" 
output="Get \"https://10.133.0.53:8000/health\": dial tcp 10.133.0.53:8000: connect: connection refused" Apr 17 16:53:58.881651 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:53:58.881616 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-74njplg"] Apr 17 16:53:58.882061 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:53:58.881990 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-74njplg" podUID="e5526f92-5731-4f5e-bdbe-860b29954df2" containerName="main" containerID="cri-o://c4459ef4b1a8a4b765d6a4ba194620a0df49ca0fda791bbe363bb35fc48a9ad6" gracePeriod=30 Apr 17 16:54:02.748438 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:54:02.748387 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="87edd822-f641-4b20-8bb7-7a7bc47058ab" containerName="main" probeResult="failure" output="Get \"https://10.133.0.53:8000/health\": dial tcp 10.133.0.53:8000: connect: connection refused" Apr 17 16:54:05.638941 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:54:05.638902 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-b448bbb4f-jnskp"] Apr 17 16:54:05.645089 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:54:05.645063 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-b448bbb4f-jnskp" Apr 17 16:54:05.648016 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:54:05.647986 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-kserve-self-signed-certs\"" Apr 17 16:54:05.648148 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:54:05.648001 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-kserve-dockercfg-szf7n\"" Apr 17 16:54:05.652972 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:54:05.652947 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-b448bbb4f-jnskp"] Apr 17 16:54:05.746377 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:54:05.746339 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sz9p4\" (UniqueName: \"kubernetes.io/projected/72151249-717b-4f4d-bb07-b1a2736dc8df-kube-api-access-sz9p4\") pod \"custom-route-timeout-pd-test-kserve-b448bbb4f-jnskp\" (UID: \"72151249-717b-4f4d-bb07-b1a2736dc8df\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-b448bbb4f-jnskp" Apr 17 16:54:05.746377 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:54:05.746380 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/72151249-717b-4f4d-bb07-b1a2736dc8df-model-cache\") pod \"custom-route-timeout-pd-test-kserve-b448bbb4f-jnskp\" (UID: \"72151249-717b-4f4d-bb07-b1a2736dc8df\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-b448bbb4f-jnskp" Apr 17 16:54:05.746606 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:54:05.746403 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: 
\"kubernetes.io/empty-dir/72151249-717b-4f4d-bb07-b1a2736dc8df-dshm\") pod \"custom-route-timeout-pd-test-kserve-b448bbb4f-jnskp\" (UID: \"72151249-717b-4f4d-bb07-b1a2736dc8df\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-b448bbb4f-jnskp" Apr 17 16:54:05.746606 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:54:05.746493 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/72151249-717b-4f4d-bb07-b1a2736dc8df-home\") pod \"custom-route-timeout-pd-test-kserve-b448bbb4f-jnskp\" (UID: \"72151249-717b-4f4d-bb07-b1a2736dc8df\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-b448bbb4f-jnskp" Apr 17 16:54:05.746606 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:54:05.746524 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/72151249-717b-4f4d-bb07-b1a2736dc8df-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-b448bbb4f-jnskp\" (UID: \"72151249-717b-4f4d-bb07-b1a2736dc8df\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-b448bbb4f-jnskp" Apr 17 16:54:05.746606 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:54:05.746547 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/72151249-717b-4f4d-bb07-b1a2736dc8df-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-b448bbb4f-jnskp\" (UID: \"72151249-717b-4f4d-bb07-b1a2736dc8df\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-b448bbb4f-jnskp" Apr 17 16:54:05.848013 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:54:05.847972 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sz9p4\" (UniqueName: \"kubernetes.io/projected/72151249-717b-4f4d-bb07-b1a2736dc8df-kube-api-access-sz9p4\") pod 
\"custom-route-timeout-pd-test-kserve-b448bbb4f-jnskp\" (UID: \"72151249-717b-4f4d-bb07-b1a2736dc8df\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-b448bbb4f-jnskp" Apr 17 16:54:05.848199 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:54:05.848018 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/72151249-717b-4f4d-bb07-b1a2736dc8df-model-cache\") pod \"custom-route-timeout-pd-test-kserve-b448bbb4f-jnskp\" (UID: \"72151249-717b-4f4d-bb07-b1a2736dc8df\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-b448bbb4f-jnskp" Apr 17 16:54:05.848199 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:54:05.848050 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/72151249-717b-4f4d-bb07-b1a2736dc8df-dshm\") pod \"custom-route-timeout-pd-test-kserve-b448bbb4f-jnskp\" (UID: \"72151249-717b-4f4d-bb07-b1a2736dc8df\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-b448bbb4f-jnskp" Apr 17 16:54:05.848199 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:54:05.848087 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/72151249-717b-4f4d-bb07-b1a2736dc8df-home\") pod \"custom-route-timeout-pd-test-kserve-b448bbb4f-jnskp\" (UID: \"72151249-717b-4f4d-bb07-b1a2736dc8df\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-b448bbb4f-jnskp" Apr 17 16:54:05.848199 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:54:05.848110 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/72151249-717b-4f4d-bb07-b1a2736dc8df-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-b448bbb4f-jnskp\" (UID: \"72151249-717b-4f4d-bb07-b1a2736dc8df\") " 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-b448bbb4f-jnskp" Apr 17 16:54:05.848199 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:54:05.848138 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/72151249-717b-4f4d-bb07-b1a2736dc8df-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-b448bbb4f-jnskp\" (UID: \"72151249-717b-4f4d-bb07-b1a2736dc8df\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-b448bbb4f-jnskp" Apr 17 16:54:05.848612 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:54:05.848588 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/72151249-717b-4f4d-bb07-b1a2736dc8df-home\") pod \"custom-route-timeout-pd-test-kserve-b448bbb4f-jnskp\" (UID: \"72151249-717b-4f4d-bb07-b1a2736dc8df\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-b448bbb4f-jnskp" Apr 17 16:54:05.848707 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:54:05.848636 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/72151249-717b-4f4d-bb07-b1a2736dc8df-model-cache\") pod \"custom-route-timeout-pd-test-kserve-b448bbb4f-jnskp\" (UID: \"72151249-717b-4f4d-bb07-b1a2736dc8df\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-b448bbb4f-jnskp" Apr 17 16:54:05.848707 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:54:05.848655 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/72151249-717b-4f4d-bb07-b1a2736dc8df-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-b448bbb4f-jnskp\" (UID: \"72151249-717b-4f4d-bb07-b1a2736dc8df\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-b448bbb4f-jnskp" Apr 17 16:54:05.850733 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:54:05.850692 2572 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/72151249-717b-4f4d-bb07-b1a2736dc8df-dshm\") pod \"custom-route-timeout-pd-test-kserve-b448bbb4f-jnskp\" (UID: \"72151249-717b-4f4d-bb07-b1a2736dc8df\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-b448bbb4f-jnskp" Apr 17 16:54:05.850871 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:54:05.850855 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/72151249-717b-4f4d-bb07-b1a2736dc8df-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-b448bbb4f-jnskp\" (UID: \"72151249-717b-4f4d-bb07-b1a2736dc8df\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-b448bbb4f-jnskp" Apr 17 16:54:05.856812 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:54:05.856791 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sz9p4\" (UniqueName: \"kubernetes.io/projected/72151249-717b-4f4d-bb07-b1a2736dc8df-kube-api-access-sz9p4\") pod \"custom-route-timeout-pd-test-kserve-b448bbb4f-jnskp\" (UID: \"72151249-717b-4f4d-bb07-b1a2736dc8df\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-b448bbb4f-jnskp" Apr 17 16:54:05.957305 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:54:05.957225 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-b448bbb4f-jnskp" Apr 17 16:54:06.096333 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:54:06.096306 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-b448bbb4f-jnskp"] Apr 17 16:54:06.098623 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:54:06.098578 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod72151249_717b_4f4d_bb07_b1a2736dc8df.slice/crio-e79bf503da2e6be4ead5571c7c0c32260533ef2df0c6013645dde74128a07cf9 WatchSource:0}: Error finding container e79bf503da2e6be4ead5571c7c0c32260533ef2df0c6013645dde74128a07cf9: Status 404 returned error can't find the container with id e79bf503da2e6be4ead5571c7c0c32260533ef2df0c6013645dde74128a07cf9 Apr 17 16:54:06.288448 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:54:06.288361 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-b448bbb4f-jnskp" event={"ID":"72151249-717b-4f4d-bb07-b1a2736dc8df","Type":"ContainerStarted","Data":"e79bf503da2e6be4ead5571c7c0c32260533ef2df0c6013645dde74128a07cf9"} Apr 17 16:54:07.295053 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:54:07.294955 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-b448bbb4f-jnskp" event={"ID":"72151249-717b-4f4d-bb07-b1a2736dc8df","Type":"ContainerStarted","Data":"bbdf5a553bba8f0db0b1dbdd6e4a10ac9ca5a4fe8027caf00b38a1cbe71c27a0"} Apr 17 16:54:07.295404 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:54:07.295137 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-b448bbb4f-jnskp" Apr 17 16:54:08.301670 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:54:08.301628 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-b448bbb4f-jnskp" event={"ID":"72151249-717b-4f4d-bb07-b1a2736dc8df","Type":"ContainerStarted","Data":"232f346cb21a4eb13c8add8659491175230b97df969da218e9acea77a66730ab"} Apr 17 16:54:11.317811 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:54:11.317781 2572 generic.go:358] "Generic (PLEG): container finished" podID="72151249-717b-4f4d-bb07-b1a2736dc8df" containerID="232f346cb21a4eb13c8add8659491175230b97df969da218e9acea77a66730ab" exitCode=0 Apr 17 16:54:11.318114 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:54:11.317867 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-b448bbb4f-jnskp" event={"ID":"72151249-717b-4f4d-bb07-b1a2736dc8df","Type":"ContainerDied","Data":"232f346cb21a4eb13c8add8659491175230b97df969da218e9acea77a66730ab"} Apr 17 16:54:12.324503 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:54:12.324468 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-b448bbb4f-jnskp" event={"ID":"72151249-717b-4f4d-bb07-b1a2736dc8df","Type":"ContainerStarted","Data":"4202af2f6d2d5de316ab837760d404a054db9eeee2a261385c10fe08f18fb0c1"} Apr 17 16:54:12.347624 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:54:12.347551 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-b448bbb4f-jnskp" podStartSLOduration=6.584108108 podStartE2EDuration="7.347533049s" podCreationTimestamp="2026-04-17 16:54:05 +0000 UTC" firstStartedPulling="2026-04-17 16:54:06.100705011 +0000 UTC m=+1368.688293980" lastFinishedPulling="2026-04-17 16:54:06.864129971 +0000 UTC m=+1369.451718921" observedRunningTime="2026-04-17 16:54:12.34316304 +0000 UTC m=+1374.930752009" watchObservedRunningTime="2026-04-17 16:54:12.347533049 +0000 UTC m=+1374.935122021" Apr 17 16:54:12.748849 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:54:12.748803 2572 
prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="87edd822-f641-4b20-8bb7-7a7bc47058ab" containerName="main" probeResult="failure" output="Get \"https://10.133.0.53:8000/health\": dial tcp 10.133.0.53:8000: connect: connection refused" Apr 17 16:54:15.957712 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:54:15.957668 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-b448bbb4f-jnskp" Apr 17 16:54:15.958238 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:54:15.957781 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-b448bbb4f-jnskp" Apr 17 16:54:15.959257 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:54:15.959232 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-b448bbb4f-jnskp" podUID="72151249-717b-4f4d-bb07-b1a2736dc8df" containerName="main" probeResult="failure" output="Get \"https://10.133.0.54:8001/health\": dial tcp 10.133.0.54:8001: connect: connection refused" Apr 17 16:54:16.362426 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:54:16.362396 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-b448bbb4f-jnskp" Apr 17 16:54:22.748749 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:54:22.748675 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="87edd822-f641-4b20-8bb7-7a7bc47058ab" containerName="main" probeResult="failure" output="Get \"https://10.133.0.53:8000/health\": dial tcp 10.133.0.53:8000: connect: connection refused" Apr 17 16:54:25.957978 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:54:25.957929 2572 prober.go:120] "Probe failed" probeType="Startup" 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-b448bbb4f-jnskp" podUID="72151249-717b-4f4d-bb07-b1a2736dc8df" containerName="main" probeResult="failure" output="Get \"https://10.133.0.54:8001/health\": dial tcp 10.133.0.54:8001: connect: connection refused" Apr 17 16:54:29.172395 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:54:29.172367 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-74njplg" Apr 17 16:54:29.270282 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:54:29.270246 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q8tmn\" (UniqueName: \"kubernetes.io/projected/e5526f92-5731-4f5e-bdbe-860b29954df2-kube-api-access-q8tmn\") pod \"e5526f92-5731-4f5e-bdbe-860b29954df2\" (UID: \"e5526f92-5731-4f5e-bdbe-860b29954df2\") " Apr 17 16:54:29.270455 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:54:29.270301 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e5526f92-5731-4f5e-bdbe-860b29954df2-kserve-provision-location\") pod \"e5526f92-5731-4f5e-bdbe-860b29954df2\" (UID: \"e5526f92-5731-4f5e-bdbe-860b29954df2\") " Apr 17 16:54:29.270455 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:54:29.270372 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e5526f92-5731-4f5e-bdbe-860b29954df2-model-cache\") pod \"e5526f92-5731-4f5e-bdbe-860b29954df2\" (UID: \"e5526f92-5731-4f5e-bdbe-860b29954df2\") " Apr 17 16:54:29.270455 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:54:29.270452 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e5526f92-5731-4f5e-bdbe-860b29954df2-dshm\") pod \"e5526f92-5731-4f5e-bdbe-860b29954df2\" (UID: 
\"e5526f92-5731-4f5e-bdbe-860b29954df2\") "
Apr 17 16:54:29.270610 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:54:29.270486 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e5526f92-5731-4f5e-bdbe-860b29954df2-home\") pod \"e5526f92-5731-4f5e-bdbe-860b29954df2\" (UID: \"e5526f92-5731-4f5e-bdbe-860b29954df2\") "
Apr 17 16:54:29.270610 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:54:29.270575 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e5526f92-5731-4f5e-bdbe-860b29954df2-tls-certs\") pod \"e5526f92-5731-4f5e-bdbe-860b29954df2\" (UID: \"e5526f92-5731-4f5e-bdbe-860b29954df2\") "
Apr 17 16:54:29.270747 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:54:29.270608 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5526f92-5731-4f5e-bdbe-860b29954df2-model-cache" (OuterVolumeSpecName: "model-cache") pod "e5526f92-5731-4f5e-bdbe-860b29954df2" (UID: "e5526f92-5731-4f5e-bdbe-860b29954df2"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 16:54:29.271006 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:54:29.270975 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5526f92-5731-4f5e-bdbe-860b29954df2-home" (OuterVolumeSpecName: "home") pod "e5526f92-5731-4f5e-bdbe-860b29954df2" (UID: "e5526f92-5731-4f5e-bdbe-860b29954df2"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 16:54:29.271359 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:54:29.271337 2572 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e5526f92-5731-4f5e-bdbe-860b29954df2-model-cache\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\""
Apr 17 16:54:29.271454 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:54:29.271367 2572 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e5526f92-5731-4f5e-bdbe-860b29954df2-home\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\""
Apr 17 16:54:29.272611 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:54:29.272585 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5526f92-5731-4f5e-bdbe-860b29954df2-kube-api-access-q8tmn" (OuterVolumeSpecName: "kube-api-access-q8tmn") pod "e5526f92-5731-4f5e-bdbe-860b29954df2" (UID: "e5526f92-5731-4f5e-bdbe-860b29954df2"). InnerVolumeSpecName "kube-api-access-q8tmn". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 16:54:29.272980 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:54:29.272956 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5526f92-5731-4f5e-bdbe-860b29954df2-dshm" (OuterVolumeSpecName: "dshm") pod "e5526f92-5731-4f5e-bdbe-860b29954df2" (UID: "e5526f92-5731-4f5e-bdbe-860b29954df2"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 16:54:29.273074 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:54:29.273050 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5526f92-5731-4f5e-bdbe-860b29954df2-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "e5526f92-5731-4f5e-bdbe-860b29954df2" (UID: "e5526f92-5731-4f5e-bdbe-860b29954df2"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 16:54:29.320436 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:54:29.320390 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5526f92-5731-4f5e-bdbe-860b29954df2-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "e5526f92-5731-4f5e-bdbe-860b29954df2" (UID: "e5526f92-5731-4f5e-bdbe-860b29954df2"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 16:54:29.372733 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:54:29.372685 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e5526f92-5731-4f5e-bdbe-860b29954df2-tls-certs\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\""
Apr 17 16:54:29.372733 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:54:29.372731 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-q8tmn\" (UniqueName: \"kubernetes.io/projected/e5526f92-5731-4f5e-bdbe-860b29954df2-kube-api-access-q8tmn\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\""
Apr 17 16:54:29.372733 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:54:29.372742 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e5526f92-5731-4f5e-bdbe-860b29954df2-kserve-provision-location\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\""
Apr 17 16:54:29.372974 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:54:29.372751 2572 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e5526f92-5731-4f5e-bdbe-860b29954df2-dshm\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\""
Apr 17 16:54:29.420393 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:54:29.420321 2572 generic.go:358] "Generic (PLEG): container finished" podID="e5526f92-5731-4f5e-bdbe-860b29954df2" containerID="c4459ef4b1a8a4b765d6a4ba194620a0df49ca0fda791bbe363bb35fc48a9ad6" exitCode=137
Apr 17 16:54:29.420525 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:54:29.420426 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-74njplg"
Apr 17 16:54:29.420525 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:54:29.420419 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-74njplg" event={"ID":"e5526f92-5731-4f5e-bdbe-860b29954df2","Type":"ContainerDied","Data":"c4459ef4b1a8a4b765d6a4ba194620a0df49ca0fda791bbe363bb35fc48a9ad6"}
Apr 17 16:54:29.420612 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:54:29.420527 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-74njplg" event={"ID":"e5526f92-5731-4f5e-bdbe-860b29954df2","Type":"ContainerDied","Data":"578041ce6e2f0c384780c8c1eaccf501ac435fcf62fd24bd7fcaf6187e8c962e"}
Apr 17 16:54:29.420612 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:54:29.420546 2572 scope.go:117] "RemoveContainer" containerID="c4459ef4b1a8a4b765d6a4ba194620a0df49ca0fda791bbe363bb35fc48a9ad6"
Apr 17 16:54:29.440574 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:54:29.440551 2572 scope.go:117] "RemoveContainer" containerID="7a4135e9ce96eb645f435e9d5f64737b5f2cb544885a3d1d1f2422fa5bf55be3"
Apr 17 16:54:29.451854 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:54:29.451828 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-74njplg"]
Apr 17 16:54:29.454213 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:54:29.454194 2572 scope.go:117] "RemoveContainer" containerID="c4459ef4b1a8a4b765d6a4ba194620a0df49ca0fda791bbe363bb35fc48a9ad6"
Apr 17 16:54:29.454510 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:54:29.454490 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4459ef4b1a8a4b765d6a4ba194620a0df49ca0fda791bbe363bb35fc48a9ad6\": container with ID starting with c4459ef4b1a8a4b765d6a4ba194620a0df49ca0fda791bbe363bb35fc48a9ad6 not found: ID does not exist" containerID="c4459ef4b1a8a4b765d6a4ba194620a0df49ca0fda791bbe363bb35fc48a9ad6"
Apr 17 16:54:29.454603 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:54:29.454520 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4459ef4b1a8a4b765d6a4ba194620a0df49ca0fda791bbe363bb35fc48a9ad6"} err="failed to get container status \"c4459ef4b1a8a4b765d6a4ba194620a0df49ca0fda791bbe363bb35fc48a9ad6\": rpc error: code = NotFound desc = could not find container \"c4459ef4b1a8a4b765d6a4ba194620a0df49ca0fda791bbe363bb35fc48a9ad6\": container with ID starting with c4459ef4b1a8a4b765d6a4ba194620a0df49ca0fda791bbe363bb35fc48a9ad6 not found: ID does not exist"
Apr 17 16:54:29.454603 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:54:29.454544 2572 scope.go:117] "RemoveContainer" containerID="7a4135e9ce96eb645f435e9d5f64737b5f2cb544885a3d1d1f2422fa5bf55be3"
Apr 17 16:54:29.454923 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:54:29.454898 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a4135e9ce96eb645f435e9d5f64737b5f2cb544885a3d1d1f2422fa5bf55be3\": container with ID starting with 7a4135e9ce96eb645f435e9d5f64737b5f2cb544885a3d1d1f2422fa5bf55be3 not found: ID does not exist" containerID="7a4135e9ce96eb645f435e9d5f64737b5f2cb544885a3d1d1f2422fa5bf55be3"
Apr 17 16:54:29.455007 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:54:29.454934 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a4135e9ce96eb645f435e9d5f64737b5f2cb544885a3d1d1f2422fa5bf55be3"} err="failed to get container status \"7a4135e9ce96eb645f435e9d5f64737b5f2cb544885a3d1d1f2422fa5bf55be3\": rpc error: code = NotFound desc = could not find container \"7a4135e9ce96eb645f435e9d5f64737b5f2cb544885a3d1d1f2422fa5bf55be3\": container with ID starting with 7a4135e9ce96eb645f435e9d5f64737b5f2cb544885a3d1d1f2422fa5bf55be3 not found: ID does not exist"
Apr 17 16:54:29.456190 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:54:29.456166 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-74njplg"]
Apr 17 16:54:29.975227 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:54:29.975197 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5526f92-5731-4f5e-bdbe-860b29954df2" path="/var/lib/kubelet/pods/e5526f92-5731-4f5e-bdbe-860b29954df2/volumes"
Apr 17 16:54:32.748568 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:54:32.748526 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="87edd822-f641-4b20-8bb7-7a7bc47058ab" containerName="main" probeResult="failure" output="Get \"https://10.133.0.53:8000/health\": dial tcp 10.133.0.53:8000: connect: connection refused"
Apr 17 16:54:35.958661 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:54:35.958583 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-b448bbb4f-jnskp" podUID="72151249-717b-4f4d-bb07-b1a2736dc8df" containerName="main" probeResult="failure" output="Get \"https://10.133.0.54:8001/health\": dial tcp 10.133.0.54:8001: connect: connection refused"
Apr 17 16:54:42.748385 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:54:42.748344 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="87edd822-f641-4b20-8bb7-7a7bc47058ab" containerName="main" probeResult="failure" output="Get \"https://10.133.0.53:8000/health\": dial tcp 10.133.0.53:8000: connect: connection refused"
Apr 17 16:54:45.957956 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:54:45.957905 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-b448bbb4f-jnskp" podUID="72151249-717b-4f4d-bb07-b1a2736dc8df" containerName="main" probeResult="failure" output="Get \"https://10.133.0.54:8001/health\": dial tcp 10.133.0.54:8001: connect: connection refused"
Apr 17 16:54:52.748682 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:54:52.748637 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="87edd822-f641-4b20-8bb7-7a7bc47058ab" containerName="main" probeResult="failure" output="Get \"https://10.133.0.53:8000/health\": dial tcp 10.133.0.53:8000: connect: connection refused"
Apr 17 16:54:55.957846 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:54:55.957781 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-b448bbb4f-jnskp" podUID="72151249-717b-4f4d-bb07-b1a2736dc8df" containerName="main" probeResult="failure" output="Get \"https://10.133.0.54:8001/health\": dial tcp 10.133.0.54:8001: connect: connection refused"
Apr 17 16:55:02.763832 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:55:02.763768 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 17 16:55:02.776134 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:55:02.776082 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 17 16:55:05.958646 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:55:05.958544 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-b448bbb4f-jnskp" podUID="72151249-717b-4f4d-bb07-b1a2736dc8df" containerName="main" probeResult="failure" output="Get \"https://10.133.0.54:8001/health\": dial tcp 10.133.0.54:8001: connect: connection refused"
Apr 17 16:55:13.458110 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:55:13.458074 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"]
Apr 17 16:55:13.458610 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:55:13.458430 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="87edd822-f641-4b20-8bb7-7a7bc47058ab" containerName="main" containerID="cri-o://787ceeb14f41b34c6c3be0eea225552fbf433064e5367d962b1de395b511be0b" gracePeriod=30
Apr 17 16:55:14.510301 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:55:14.510278 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 17 16:55:14.609994 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:55:14.609964 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/87edd822-f641-4b20-8bb7-7a7bc47058ab-tls-certs\") pod \"87edd822-f641-4b20-8bb7-7a7bc47058ab\" (UID: \"87edd822-f641-4b20-8bb7-7a7bc47058ab\") "
Apr 17 16:55:14.610175 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:55:14.609998 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/87edd822-f641-4b20-8bb7-7a7bc47058ab-dshm\") pod \"87edd822-f641-4b20-8bb7-7a7bc47058ab\" (UID: \"87edd822-f641-4b20-8bb7-7a7bc47058ab\") "
Apr 17 16:55:14.610175 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:55:14.610027 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/87edd822-f641-4b20-8bb7-7a7bc47058ab-home\") pod \"87edd822-f641-4b20-8bb7-7a7bc47058ab\" (UID: \"87edd822-f641-4b20-8bb7-7a7bc47058ab\") "
Apr 17 16:55:14.610175 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:55:14.610045 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/87edd822-f641-4b20-8bb7-7a7bc47058ab-kserve-provision-location\") pod \"87edd822-f641-4b20-8bb7-7a7bc47058ab\" (UID: \"87edd822-f641-4b20-8bb7-7a7bc47058ab\") "
Apr 17 16:55:14.610175 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:55:14.610096 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kvw76\" (UniqueName: \"kubernetes.io/projected/87edd822-f641-4b20-8bb7-7a7bc47058ab-kube-api-access-kvw76\") pod \"87edd822-f641-4b20-8bb7-7a7bc47058ab\" (UID: \"87edd822-f641-4b20-8bb7-7a7bc47058ab\") "
Apr 17 16:55:14.610414 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:55:14.610193 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/87edd822-f641-4b20-8bb7-7a7bc47058ab-model-cache\") pod \"87edd822-f641-4b20-8bb7-7a7bc47058ab\" (UID: \"87edd822-f641-4b20-8bb7-7a7bc47058ab\") "
Apr 17 16:55:14.610414 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:55:14.610358 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87edd822-f641-4b20-8bb7-7a7bc47058ab-home" (OuterVolumeSpecName: "home") pod "87edd822-f641-4b20-8bb7-7a7bc47058ab" (UID: "87edd822-f641-4b20-8bb7-7a7bc47058ab"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 16:55:14.610519 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:55:14.610465 2572 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/87edd822-f641-4b20-8bb7-7a7bc47058ab-home\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\""
Apr 17 16:55:14.610519 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:55:14.610512 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87edd822-f641-4b20-8bb7-7a7bc47058ab-model-cache" (OuterVolumeSpecName: "model-cache") pod "87edd822-f641-4b20-8bb7-7a7bc47058ab" (UID: "87edd822-f641-4b20-8bb7-7a7bc47058ab"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 16:55:14.612200 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:55:14.612163 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87edd822-f641-4b20-8bb7-7a7bc47058ab-dshm" (OuterVolumeSpecName: "dshm") pod "87edd822-f641-4b20-8bb7-7a7bc47058ab" (UID: "87edd822-f641-4b20-8bb7-7a7bc47058ab"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 16:55:14.612318 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:55:14.612265 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87edd822-f641-4b20-8bb7-7a7bc47058ab-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "87edd822-f641-4b20-8bb7-7a7bc47058ab" (UID: "87edd822-f641-4b20-8bb7-7a7bc47058ab"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 16:55:14.612370 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:55:14.612357 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87edd822-f641-4b20-8bb7-7a7bc47058ab-kube-api-access-kvw76" (OuterVolumeSpecName: "kube-api-access-kvw76") pod "87edd822-f641-4b20-8bb7-7a7bc47058ab" (UID: "87edd822-f641-4b20-8bb7-7a7bc47058ab"). InnerVolumeSpecName "kube-api-access-kvw76". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 16:55:14.619873 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:55:14.619845 2572 generic.go:358] "Generic (PLEG): container finished" podID="87edd822-f641-4b20-8bb7-7a7bc47058ab" containerID="787ceeb14f41b34c6c3be0eea225552fbf433064e5367d962b1de395b511be0b" exitCode=0
Apr 17 16:55:14.619975 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:55:14.619903 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 17 16:55:14.619975 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:55:14.619917 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"87edd822-f641-4b20-8bb7-7a7bc47058ab","Type":"ContainerDied","Data":"787ceeb14f41b34c6c3be0eea225552fbf433064e5367d962b1de395b511be0b"}
Apr 17 16:55:14.619975 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:55:14.619962 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"87edd822-f641-4b20-8bb7-7a7bc47058ab","Type":"ContainerDied","Data":"a9d54c5d1f397d991d52312bd9e00d6f3e56ba6569d77e5f2d78d43487a075b9"}
Apr 17 16:55:14.620128 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:55:14.619983 2572 scope.go:117] "RemoveContainer" containerID="787ceeb14f41b34c6c3be0eea225552fbf433064e5367d962b1de395b511be0b"
Apr 17 16:55:14.644502 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:55:14.644480 2572 scope.go:117] "RemoveContainer" containerID="a7ceb24c90abebefee05513df884f788aaaa322d9e47bc10a6addcf1c7d106c6"
Apr 17 16:55:14.682738 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:55:14.682688 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87edd822-f641-4b20-8bb7-7a7bc47058ab-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "87edd822-f641-4b20-8bb7-7a7bc47058ab" (UID: "87edd822-f641-4b20-8bb7-7a7bc47058ab"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 16:55:14.706366 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:55:14.706339 2572 scope.go:117] "RemoveContainer" containerID="787ceeb14f41b34c6c3be0eea225552fbf433064e5367d962b1de395b511be0b"
Apr 17 16:55:14.706693 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:55:14.706673 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"787ceeb14f41b34c6c3be0eea225552fbf433064e5367d962b1de395b511be0b\": container with ID starting with 787ceeb14f41b34c6c3be0eea225552fbf433064e5367d962b1de395b511be0b not found: ID does not exist" containerID="787ceeb14f41b34c6c3be0eea225552fbf433064e5367d962b1de395b511be0b"
Apr 17 16:55:14.706794 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:55:14.706704 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"787ceeb14f41b34c6c3be0eea225552fbf433064e5367d962b1de395b511be0b"} err="failed to get container status \"787ceeb14f41b34c6c3be0eea225552fbf433064e5367d962b1de395b511be0b\": rpc error: code = NotFound desc = could not find container \"787ceeb14f41b34c6c3be0eea225552fbf433064e5367d962b1de395b511be0b\": container with ID starting with 787ceeb14f41b34c6c3be0eea225552fbf433064e5367d962b1de395b511be0b not found: ID does not exist"
Apr 17 16:55:14.706794 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:55:14.706742 2572 scope.go:117] "RemoveContainer" containerID="a7ceb24c90abebefee05513df884f788aaaa322d9e47bc10a6addcf1c7d106c6"
Apr 17 16:55:14.707007 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:55:14.706991 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7ceb24c90abebefee05513df884f788aaaa322d9e47bc10a6addcf1c7d106c6\": container with ID starting with a7ceb24c90abebefee05513df884f788aaaa322d9e47bc10a6addcf1c7d106c6 not found: ID does not exist" containerID="a7ceb24c90abebefee05513df884f788aaaa322d9e47bc10a6addcf1c7d106c6"
Apr 17 16:55:14.707050 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:55:14.707011 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7ceb24c90abebefee05513df884f788aaaa322d9e47bc10a6addcf1c7d106c6"} err="failed to get container status \"a7ceb24c90abebefee05513df884f788aaaa322d9e47bc10a6addcf1c7d106c6\": rpc error: code = NotFound desc = could not find container \"a7ceb24c90abebefee05513df884f788aaaa322d9e47bc10a6addcf1c7d106c6\": container with ID starting with a7ceb24c90abebefee05513df884f788aaaa322d9e47bc10a6addcf1c7d106c6 not found: ID does not exist"
Apr 17 16:55:14.711458 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:55:14.711440 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kvw76\" (UniqueName: \"kubernetes.io/projected/87edd822-f641-4b20-8bb7-7a7bc47058ab-kube-api-access-kvw76\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\""
Apr 17 16:55:14.711520 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:55:14.711461 2572 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/87edd822-f641-4b20-8bb7-7a7bc47058ab-model-cache\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\""
Apr 17 16:55:14.711520 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:55:14.711471 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/87edd822-f641-4b20-8bb7-7a7bc47058ab-tls-certs\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\""
Apr 17 16:55:14.711520 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:55:14.711479 2572 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/87edd822-f641-4b20-8bb7-7a7bc47058ab-dshm\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\""
Apr 17 16:55:14.711520 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:55:14.711487 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/87edd822-f641-4b20-8bb7-7a7bc47058ab-kserve-provision-location\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\""
Apr 17 16:55:14.943819 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:55:14.943785 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"]
Apr 17 16:55:14.947571 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:55:14.947548 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"]
Apr 17 16:55:15.958287 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:55:15.958239 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-b448bbb4f-jnskp" podUID="72151249-717b-4f4d-bb07-b1a2736dc8df" containerName="main" probeResult="failure" output="Get \"https://10.133.0.54:8001/health\": dial tcp 10.133.0.54:8001: connect: connection refused"
Apr 17 16:55:15.975268 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:55:15.975236 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87edd822-f641-4b20-8bb7-7a7bc47058ab" path="/var/lib/kubelet/pods/87edd822-f641-4b20-8bb7-7a7bc47058ab/volumes"
Apr 17 16:55:25.958341 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:55:25.958280 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-b448bbb4f-jnskp" podUID="72151249-717b-4f4d-bb07-b1a2736dc8df" containerName="main" probeResult="failure" output="Get \"https://10.133.0.54:8001/health\": dial tcp 10.133.0.54:8001: connect: connection refused"
Apr 17 16:55:35.958021 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:55:35.957968 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-b448bbb4f-jnskp" podUID="72151249-717b-4f4d-bb07-b1a2736dc8df" containerName="main" probeResult="failure" output="Get \"https://10.133.0.54:8001/health\": dial tcp 10.133.0.54:8001: connect: connection refused"
Apr 17 16:55:45.958478 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:55:45.958415 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-b448bbb4f-jnskp" podUID="72151249-717b-4f4d-bb07-b1a2736dc8df" containerName="main" probeResult="failure" output="Get \"https://10.133.0.54:8001/health\": dial tcp 10.133.0.54:8001: connect: connection refused"
Apr 17 16:55:55.968128 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:55:55.968096 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-b448bbb4f-jnskp"
Apr 17 16:55:55.980888 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:55:55.980861 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-b448bbb4f-jnskp"
Apr 17 16:56:07.128334 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:56:07.128301 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-b448bbb4f-jnskp"]
Apr 17 16:56:07.128703 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:56:07.128642 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-b448bbb4f-jnskp" podUID="72151249-717b-4f4d-bb07-b1a2736dc8df" containerName="main" containerID="cri-o://4202af2f6d2d5de316ab837760d404a054db9eeee2a261385c10fe08f18fb0c1" gracePeriod=30
Apr 17 16:56:15.509128 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:56:15.509095 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz"]
Apr 17 16:56:15.509497 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:56:15.509474 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e5526f92-5731-4f5e-bdbe-860b29954df2" containerName="storage-initializer"
Apr 17 16:56:15.509497 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:56:15.509490 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5526f92-5731-4f5e-bdbe-860b29954df2" containerName="storage-initializer"
Apr 17 16:56:15.509571 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:56:15.509502 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e5526f92-5731-4f5e-bdbe-860b29954df2" containerName="main"
Apr 17 16:56:15.509571 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:56:15.509507 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5526f92-5731-4f5e-bdbe-860b29954df2" containerName="main"
Apr 17 16:56:15.509571 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:56:15.509513 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="87edd822-f641-4b20-8bb7-7a7bc47058ab" containerName="storage-initializer"
Apr 17 16:56:15.509571 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:56:15.509520 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="87edd822-f641-4b20-8bb7-7a7bc47058ab" containerName="storage-initializer"
Apr 17 16:56:15.509571 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:56:15.509527 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="87edd822-f641-4b20-8bb7-7a7bc47058ab" containerName="main"
Apr 17 16:56:15.509571 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:56:15.509533 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="87edd822-f641-4b20-8bb7-7a7bc47058ab" containerName="main"
Apr 17 16:56:15.509789 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:56:15.509592 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="e5526f92-5731-4f5e-bdbe-860b29954df2" containerName="main"
Apr 17 16:56:15.509789 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:56:15.509602 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="87edd822-f641-4b20-8bb7-7a7bc47058ab" containerName="main"
Apr 17 16:56:15.512482 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:56:15.512466 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz"
Apr 17 16:56:15.514825 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:56:15.514804 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-kserve-dockercfg-q6jvl\""
Apr 17 16:56:15.514985 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:56:15.514967 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-kserve-self-signed-certs\""
Apr 17 16:56:15.527645 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:56:15.527622 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz"]
Apr 17 16:56:15.668630 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:56:15.668604 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/825f2b45-8699-4e88-9f42-348c0b14ba6f-dshm\") pod \"router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz\" (UID: \"825f2b45-8699-4e88-9f42-348c0b14ba6f\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz"
Apr 17 16:56:15.668793 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:56:15.668646 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/825f2b45-8699-4e88-9f42-348c0b14ba6f-tls-certs\") pod \"router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz\" (UID: \"825f2b45-8699-4e88-9f42-348c0b14ba6f\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz"
Apr 17 16:56:15.668793 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:56:15.668679 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/825f2b45-8699-4e88-9f42-348c0b14ba6f-home\") pod \"router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz\" (UID: \"825f2b45-8699-4e88-9f42-348c0b14ba6f\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz"
Apr 17 16:56:15.668793 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:56:15.668697 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gs64n\" (UniqueName: \"kubernetes.io/projected/825f2b45-8699-4e88-9f42-348c0b14ba6f-kube-api-access-gs64n\") pod \"router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz\" (UID: \"825f2b45-8699-4e88-9f42-348c0b14ba6f\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz"
Apr 17 16:56:15.668793 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:56:15.668731 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/825f2b45-8699-4e88-9f42-348c0b14ba6f-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz\" (UID: \"825f2b45-8699-4e88-9f42-348c0b14ba6f\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz"
Apr 17 16:56:15.668793 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:56:15.668759 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/825f2b45-8699-4e88-9f42-348c0b14ba6f-model-cache\") pod \"router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz\" (UID: \"825f2b45-8699-4e88-9f42-348c0b14ba6f\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz"
Apr 17 16:56:15.769914 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:56:15.769826 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/825f2b45-8699-4e88-9f42-348c0b14ba6f-home\") pod \"router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz\" (UID: \"825f2b45-8699-4e88-9f42-348c0b14ba6f\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz"
Apr 17 16:56:15.769914 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:56:15.769872 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gs64n\" (UniqueName: \"kubernetes.io/projected/825f2b45-8699-4e88-9f42-348c0b14ba6f-kube-api-access-gs64n\") pod \"router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz\" (UID: \"825f2b45-8699-4e88-9f42-348c0b14ba6f\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz"
Apr 17 16:56:15.769914 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:56:15.769894 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/825f2b45-8699-4e88-9f42-348c0b14ba6f-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz\" (UID: \"825f2b45-8699-4e88-9f42-348c0b14ba6f\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz"
Apr 17 16:56:15.769914 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:56:15.769923 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/825f2b45-8699-4e88-9f42-348c0b14ba6f-model-cache\") pod \"router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz\" (UID: \"825f2b45-8699-4e88-9f42-348c0b14ba6f\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz"
Apr 17 16:56:15.770236 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:56:15.769960 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/825f2b45-8699-4e88-9f42-348c0b14ba6f-dshm\") pod \"router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz\" (UID: \"825f2b45-8699-4e88-9f42-348c0b14ba6f\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz"
Apr 17 16:56:15.770236 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:56:15.770000 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/825f2b45-8699-4e88-9f42-348c0b14ba6f-tls-certs\") pod \"router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz\" (UID: \"825f2b45-8699-4e88-9f42-348c0b14ba6f\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz"
Apr 17 16:56:15.770345 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:56:15.770267 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/825f2b45-8699-4e88-9f42-348c0b14ba6f-home\") pod \"router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz\" (UID: \"825f2b45-8699-4e88-9f42-348c0b14ba6f\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz"
Apr 17 16:56:15.770345 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:56:15.770277 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/825f2b45-8699-4e88-9f42-348c0b14ba6f-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz\" (UID: \"825f2b45-8699-4e88-9f42-348c0b14ba6f\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz"
Apr 17 16:56:15.770436 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:56:15.770388 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/825f2b45-8699-4e88-9f42-348c0b14ba6f-model-cache\") pod \"router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz\" (UID: \"825f2b45-8699-4e88-9f42-348c0b14ba6f\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz"
Apr 17 16:56:15.772319 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:56:15.772296 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/825f2b45-8699-4e88-9f42-348c0b14ba6f-dshm\") pod \"router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz\" (UID: \"825f2b45-8699-4e88-9f42-348c0b14ba6f\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz"
Apr 17 16:56:15.772578 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:56:15.772562 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/825f2b45-8699-4e88-9f42-348c0b14ba6f-tls-certs\") pod \"router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz\" (UID: \"825f2b45-8699-4e88-9f42-348c0b14ba6f\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz"
Apr 17 16:56:15.777670 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:56:15.777649 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gs64n\" (UniqueName: \"kubernetes.io/projected/825f2b45-8699-4e88-9f42-348c0b14ba6f-kube-api-access-gs64n\") pod \"router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz\" (UID: \"825f2b45-8699-4e88-9f42-348c0b14ba6f\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz"
Apr 17 16:56:15.821773 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:56:15.821739 2572 util.go:30] "No sandbox
for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz" Apr 17 16:56:15.953282 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:56:15.953247 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz"] Apr 17 16:56:15.955729 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:56:15.955688 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod825f2b45_8699_4e88_9f42_348c0b14ba6f.slice/crio-5b390f15ff02934de418cabfe741ba3d213672269921c66a9b9a24de5175de48 WatchSource:0}: Error finding container 5b390f15ff02934de418cabfe741ba3d213672269921c66a9b9a24de5175de48: Status 404 returned error can't find the container with id 5b390f15ff02934de418cabfe741ba3d213672269921c66a9b9a24de5175de48 Apr 17 16:56:16.869787 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:56:16.869745 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz" event={"ID":"825f2b45-8699-4e88-9f42-348c0b14ba6f","Type":"ContainerStarted","Data":"7b1b27c837c75eb1e6bcce5c41bd3a4c8c7a1a6e064df79e655d75e1d613c819"} Apr 17 16:56:16.870161 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:56:16.869794 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz" event={"ID":"825f2b45-8699-4e88-9f42-348c0b14ba6f","Type":"ContainerStarted","Data":"5b390f15ff02934de418cabfe741ba3d213672269921c66a9b9a24de5175de48"} Apr 17 16:56:16.870161 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:56:16.869896 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz" Apr 17 16:56:17.875753 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:56:17.875697 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz" event={"ID":"825f2b45-8699-4e88-9f42-348c0b14ba6f","Type":"ContainerStarted","Data":"06dc6c7189cb1fb35e641bcb05c273d35072f9eb6b65455e46c26f8c704cf8a7"} Apr 17 16:56:17.977812 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:56:17.977781 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-qghlf_4135c5b6-7f8a-4eaf-b551-405c8ab00981/console-operator/1.log" Apr 17 16:56:17.983995 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:56:17.983971 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vj4bz_f0075f81-88ff-4518-bd4e-bc50656a593b/ovn-acl-logging/0.log" Apr 17 16:56:17.985201 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:56:17.985183 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-qghlf_4135c5b6-7f8a-4eaf-b551-405c8ab00981/console-operator/1.log" Apr 17 16:56:17.991088 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:56:17.991068 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vj4bz_f0075f81-88ff-4518-bd4e-bc50656a593b/ovn-acl-logging/0.log" Apr 17 16:56:20.890159 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:56:20.890120 2572 generic.go:358] "Generic (PLEG): container finished" podID="825f2b45-8699-4e88-9f42-348c0b14ba6f" containerID="06dc6c7189cb1fb35e641bcb05c273d35072f9eb6b65455e46c26f8c704cf8a7" exitCode=0 Apr 17 16:56:20.890574 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:56:20.890191 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz" event={"ID":"825f2b45-8699-4e88-9f42-348c0b14ba6f","Type":"ContainerDied","Data":"06dc6c7189cb1fb35e641bcb05c273d35072f9eb6b65455e46c26f8c704cf8a7"} Apr 17 16:56:21.897622 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:56:21.897580 2572 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz" event={"ID":"825f2b45-8699-4e88-9f42-348c0b14ba6f","Type":"ContainerStarted","Data":"5e5e33a610a0dd3780154ba87a6da35c9815810ac1b5fc54b85d93c77a89dba3"} Apr 17 16:56:21.922412 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:56:21.922356 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz" podStartSLOduration=6.922338749 podStartE2EDuration="6.922338749s" podCreationTimestamp="2026-04-17 16:56:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:56:21.918517386 +0000 UTC m=+1504.506106356" watchObservedRunningTime="2026-04-17 16:56:21.922338749 +0000 UTC m=+1504.509927720" Apr 17 16:56:25.822219 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:56:25.822179 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz" Apr 17 16:56:25.822219 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:56:25.822223 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz" Apr 17 16:56:25.823835 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:56:25.823796 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz" podUID="825f2b45-8699-4e88-9f42-348c0b14ba6f" containerName="main" probeResult="failure" output="Get \"https://10.133.0.55:8001/health\": dial tcp 10.133.0.55:8001: connect: connection refused" Apr 17 16:56:35.822878 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:56:35.822782 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz" 
podUID="825f2b45-8699-4e88-9f42-348c0b14ba6f" containerName="main" probeResult="failure" output="Get \"https://10.133.0.55:8001/health\": dial tcp 10.133.0.55:8001: connect: connection refused" Apr 17 16:56:35.842094 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:56:35.842066 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz" Apr 17 16:56:37.129039 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:56:37.128992 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-b448bbb4f-jnskp" podUID="72151249-717b-4f4d-bb07-b1a2736dc8df" containerName="llm-d-routing-sidecar" containerID="cri-o://bbdf5a553bba8f0db0b1dbdd6e4a10ac9ca5a4fe8027caf00b38a1cbe71c27a0" gracePeriod=2 Apr 17 16:56:37.426880 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:56:37.426856 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-pd-test-kserve-b448bbb4f-jnskp_72151249-717b-4f4d-bb07-b1a2736dc8df/main/0.log" Apr 17 16:56:37.427508 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:56:37.427490 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-b448bbb4f-jnskp" Apr 17 16:56:37.473165 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:56:37.473140 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/72151249-717b-4f4d-bb07-b1a2736dc8df-kserve-provision-location\") pod \"72151249-717b-4f4d-bb07-b1a2736dc8df\" (UID: \"72151249-717b-4f4d-bb07-b1a2736dc8df\") " Apr 17 16:56:37.473337 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:56:37.473178 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sz9p4\" (UniqueName: \"kubernetes.io/projected/72151249-717b-4f4d-bb07-b1a2736dc8df-kube-api-access-sz9p4\") pod \"72151249-717b-4f4d-bb07-b1a2736dc8df\" (UID: \"72151249-717b-4f4d-bb07-b1a2736dc8df\") " Apr 17 16:56:37.473337 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:56:37.473224 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/72151249-717b-4f4d-bb07-b1a2736dc8df-model-cache\") pod \"72151249-717b-4f4d-bb07-b1a2736dc8df\" (UID: \"72151249-717b-4f4d-bb07-b1a2736dc8df\") " Apr 17 16:56:37.473337 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:56:37.473245 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/72151249-717b-4f4d-bb07-b1a2736dc8df-dshm\") pod \"72151249-717b-4f4d-bb07-b1a2736dc8df\" (UID: \"72151249-717b-4f4d-bb07-b1a2736dc8df\") " Apr 17 16:56:37.473337 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:56:37.473294 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/72151249-717b-4f4d-bb07-b1a2736dc8df-tls-certs\") pod \"72151249-717b-4f4d-bb07-b1a2736dc8df\" (UID: \"72151249-717b-4f4d-bb07-b1a2736dc8df\") " Apr 17 16:56:37.473557 
ip-10-0-131-177 kubenswrapper[2572]: I0417 16:56:37.473360 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/72151249-717b-4f4d-bb07-b1a2736dc8df-home\") pod \"72151249-717b-4f4d-bb07-b1a2736dc8df\" (UID: \"72151249-717b-4f4d-bb07-b1a2736dc8df\") " Apr 17 16:56:37.473557 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:56:37.473471 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72151249-717b-4f4d-bb07-b1a2736dc8df-model-cache" (OuterVolumeSpecName: "model-cache") pod "72151249-717b-4f4d-bb07-b1a2736dc8df" (UID: "72151249-717b-4f4d-bb07-b1a2736dc8df"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:56:37.473674 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:56:37.473656 2572 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/72151249-717b-4f4d-bb07-b1a2736dc8df-model-cache\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\"" Apr 17 16:56:37.473816 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:56:37.473793 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72151249-717b-4f4d-bb07-b1a2736dc8df-home" (OuterVolumeSpecName: "home") pod "72151249-717b-4f4d-bb07-b1a2736dc8df" (UID: "72151249-717b-4f4d-bb07-b1a2736dc8df"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:56:37.475776 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:56:37.475735 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72151249-717b-4f4d-bb07-b1a2736dc8df-kube-api-access-sz9p4" (OuterVolumeSpecName: "kube-api-access-sz9p4") pod "72151249-717b-4f4d-bb07-b1a2736dc8df" (UID: "72151249-717b-4f4d-bb07-b1a2736dc8df"). InnerVolumeSpecName "kube-api-access-sz9p4". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:56:37.475871 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:56:37.475810 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72151249-717b-4f4d-bb07-b1a2736dc8df-dshm" (OuterVolumeSpecName: "dshm") pod "72151249-717b-4f4d-bb07-b1a2736dc8df" (UID: "72151249-717b-4f4d-bb07-b1a2736dc8df"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:56:37.476060 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:56:37.476033 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72151249-717b-4f4d-bb07-b1a2736dc8df-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "72151249-717b-4f4d-bb07-b1a2736dc8df" (UID: "72151249-717b-4f4d-bb07-b1a2736dc8df"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:56:37.528075 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:56:37.528043 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72151249-717b-4f4d-bb07-b1a2736dc8df-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "72151249-717b-4f4d-bb07-b1a2736dc8df" (UID: "72151249-717b-4f4d-bb07-b1a2736dc8df"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:56:37.574258 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:56:37.574223 2572 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/72151249-717b-4f4d-bb07-b1a2736dc8df-home\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\"" Apr 17 16:56:37.574383 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:56:37.574261 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/72151249-717b-4f4d-bb07-b1a2736dc8df-kserve-provision-location\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\"" Apr 17 16:56:37.574383 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:56:37.574278 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sz9p4\" (UniqueName: \"kubernetes.io/projected/72151249-717b-4f4d-bb07-b1a2736dc8df-kube-api-access-sz9p4\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\"" Apr 17 16:56:37.574383 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:56:37.574293 2572 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/72151249-717b-4f4d-bb07-b1a2736dc8df-dshm\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\"" Apr 17 16:56:37.574383 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:56:37.574309 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/72151249-717b-4f4d-bb07-b1a2736dc8df-tls-certs\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\"" Apr 17 16:56:37.972452 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:56:37.972417 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-pd-test-kserve-b448bbb4f-jnskp_72151249-717b-4f4d-bb07-b1a2736dc8df/main/0.log" Apr 17 16:56:37.973071 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:56:37.973041 2572 generic.go:358] "Generic (PLEG): container 
finished" podID="72151249-717b-4f4d-bb07-b1a2736dc8df" containerID="4202af2f6d2d5de316ab837760d404a054db9eeee2a261385c10fe08f18fb0c1" exitCode=137 Apr 17 16:56:37.973071 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:56:37.973070 2572 generic.go:358] "Generic (PLEG): container finished" podID="72151249-717b-4f4d-bb07-b1a2736dc8df" containerID="bbdf5a553bba8f0db0b1dbdd6e4a10ac9ca5a4fe8027caf00b38a1cbe71c27a0" exitCode=0 Apr 17 16:56:37.973215 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:56:37.973152 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-b448bbb4f-jnskp" Apr 17 16:56:37.974605 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:56:37.974576 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-b448bbb4f-jnskp" event={"ID":"72151249-717b-4f4d-bb07-b1a2736dc8df","Type":"ContainerDied","Data":"4202af2f6d2d5de316ab837760d404a054db9eeee2a261385c10fe08f18fb0c1"} Apr 17 16:56:37.974693 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:56:37.974616 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-b448bbb4f-jnskp" event={"ID":"72151249-717b-4f4d-bb07-b1a2736dc8df","Type":"ContainerDied","Data":"bbdf5a553bba8f0db0b1dbdd6e4a10ac9ca5a4fe8027caf00b38a1cbe71c27a0"} Apr 17 16:56:37.974693 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:56:37.974633 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-b448bbb4f-jnskp" event={"ID":"72151249-717b-4f4d-bb07-b1a2736dc8df","Type":"ContainerDied","Data":"e79bf503da2e6be4ead5571c7c0c32260533ef2df0c6013645dde74128a07cf9"} Apr 17 16:56:37.974693 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:56:37.974656 2572 scope.go:117] "RemoveContainer" containerID="4202af2f6d2d5de316ab837760d404a054db9eeee2a261385c10fe08f18fb0c1" Apr 17 16:56:37.998417 ip-10-0-131-177 
kubenswrapper[2572]: I0417 16:56:37.998396 2572 scope.go:117] "RemoveContainer" containerID="232f346cb21a4eb13c8add8659491175230b97df969da218e9acea77a66730ab" Apr 17 16:56:38.005940 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:56:38.005909 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-b448bbb4f-jnskp"] Apr 17 16:56:38.008199 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:56:38.008177 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-b448bbb4f-jnskp"] Apr 17 16:56:38.060283 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:56:38.060262 2572 scope.go:117] "RemoveContainer" containerID="bbdf5a553bba8f0db0b1dbdd6e4a10ac9ca5a4fe8027caf00b38a1cbe71c27a0" Apr 17 16:56:38.068774 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:56:38.068754 2572 scope.go:117] "RemoveContainer" containerID="4202af2f6d2d5de316ab837760d404a054db9eeee2a261385c10fe08f18fb0c1" Apr 17 16:56:38.069069 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:56:38.069046 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4202af2f6d2d5de316ab837760d404a054db9eeee2a261385c10fe08f18fb0c1\": container with ID starting with 4202af2f6d2d5de316ab837760d404a054db9eeee2a261385c10fe08f18fb0c1 not found: ID does not exist" containerID="4202af2f6d2d5de316ab837760d404a054db9eeee2a261385c10fe08f18fb0c1" Apr 17 16:56:38.069133 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:56:38.069079 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4202af2f6d2d5de316ab837760d404a054db9eeee2a261385c10fe08f18fb0c1"} err="failed to get container status \"4202af2f6d2d5de316ab837760d404a054db9eeee2a261385c10fe08f18fb0c1\": rpc error: code = NotFound desc = could not find container \"4202af2f6d2d5de316ab837760d404a054db9eeee2a261385c10fe08f18fb0c1\": container with ID starting with 
4202af2f6d2d5de316ab837760d404a054db9eeee2a261385c10fe08f18fb0c1 not found: ID does not exist" Apr 17 16:56:38.069133 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:56:38.069100 2572 scope.go:117] "RemoveContainer" containerID="232f346cb21a4eb13c8add8659491175230b97df969da218e9acea77a66730ab" Apr 17 16:56:38.069324 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:56:38.069310 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"232f346cb21a4eb13c8add8659491175230b97df969da218e9acea77a66730ab\": container with ID starting with 232f346cb21a4eb13c8add8659491175230b97df969da218e9acea77a66730ab not found: ID does not exist" containerID="232f346cb21a4eb13c8add8659491175230b97df969da218e9acea77a66730ab" Apr 17 16:56:38.069367 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:56:38.069328 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"232f346cb21a4eb13c8add8659491175230b97df969da218e9acea77a66730ab"} err="failed to get container status \"232f346cb21a4eb13c8add8659491175230b97df969da218e9acea77a66730ab\": rpc error: code = NotFound desc = could not find container \"232f346cb21a4eb13c8add8659491175230b97df969da218e9acea77a66730ab\": container with ID starting with 232f346cb21a4eb13c8add8659491175230b97df969da218e9acea77a66730ab not found: ID does not exist" Apr 17 16:56:38.069367 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:56:38.069339 2572 scope.go:117] "RemoveContainer" containerID="bbdf5a553bba8f0db0b1dbdd6e4a10ac9ca5a4fe8027caf00b38a1cbe71c27a0" Apr 17 16:56:38.069571 ip-10-0-131-177 kubenswrapper[2572]: E0417 16:56:38.069550 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bbdf5a553bba8f0db0b1dbdd6e4a10ac9ca5a4fe8027caf00b38a1cbe71c27a0\": container with ID starting with bbdf5a553bba8f0db0b1dbdd6e4a10ac9ca5a4fe8027caf00b38a1cbe71c27a0 not found: ID does not exist" 
containerID="bbdf5a553bba8f0db0b1dbdd6e4a10ac9ca5a4fe8027caf00b38a1cbe71c27a0" Apr 17 16:56:38.069612 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:56:38.069577 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbdf5a553bba8f0db0b1dbdd6e4a10ac9ca5a4fe8027caf00b38a1cbe71c27a0"} err="failed to get container status \"bbdf5a553bba8f0db0b1dbdd6e4a10ac9ca5a4fe8027caf00b38a1cbe71c27a0\": rpc error: code = NotFound desc = could not find container \"bbdf5a553bba8f0db0b1dbdd6e4a10ac9ca5a4fe8027caf00b38a1cbe71c27a0\": container with ID starting with bbdf5a553bba8f0db0b1dbdd6e4a10ac9ca5a4fe8027caf00b38a1cbe71c27a0 not found: ID does not exist" Apr 17 16:56:38.069612 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:56:38.069593 2572 scope.go:117] "RemoveContainer" containerID="4202af2f6d2d5de316ab837760d404a054db9eeee2a261385c10fe08f18fb0c1" Apr 17 16:56:38.069817 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:56:38.069793 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4202af2f6d2d5de316ab837760d404a054db9eeee2a261385c10fe08f18fb0c1"} err="failed to get container status \"4202af2f6d2d5de316ab837760d404a054db9eeee2a261385c10fe08f18fb0c1\": rpc error: code = NotFound desc = could not find container \"4202af2f6d2d5de316ab837760d404a054db9eeee2a261385c10fe08f18fb0c1\": container with ID starting with 4202af2f6d2d5de316ab837760d404a054db9eeee2a261385c10fe08f18fb0c1 not found: ID does not exist" Apr 17 16:56:38.069902 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:56:38.069818 2572 scope.go:117] "RemoveContainer" containerID="232f346cb21a4eb13c8add8659491175230b97df969da218e9acea77a66730ab" Apr 17 16:56:38.070043 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:56:38.070026 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"232f346cb21a4eb13c8add8659491175230b97df969da218e9acea77a66730ab"} err="failed to get container status 
\"232f346cb21a4eb13c8add8659491175230b97df969da218e9acea77a66730ab\": rpc error: code = NotFound desc = could not find container \"232f346cb21a4eb13c8add8659491175230b97df969da218e9acea77a66730ab\": container with ID starting with 232f346cb21a4eb13c8add8659491175230b97df969da218e9acea77a66730ab not found: ID does not exist" Apr 17 16:56:38.070109 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:56:38.070044 2572 scope.go:117] "RemoveContainer" containerID="bbdf5a553bba8f0db0b1dbdd6e4a10ac9ca5a4fe8027caf00b38a1cbe71c27a0" Apr 17 16:56:38.070264 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:56:38.070245 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbdf5a553bba8f0db0b1dbdd6e4a10ac9ca5a4fe8027caf00b38a1cbe71c27a0"} err="failed to get container status \"bbdf5a553bba8f0db0b1dbdd6e4a10ac9ca5a4fe8027caf00b38a1cbe71c27a0\": rpc error: code = NotFound desc = could not find container \"bbdf5a553bba8f0db0b1dbdd6e4a10ac9ca5a4fe8027caf00b38a1cbe71c27a0\": container with ID starting with bbdf5a553bba8f0db0b1dbdd6e4a10ac9ca5a4fe8027caf00b38a1cbe71c27a0 not found: ID does not exist" Apr 17 16:56:39.974064 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:56:39.974024 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72151249-717b-4f4d-bb07-b1a2736dc8df" path="/var/lib/kubelet/pods/72151249-717b-4f4d-bb07-b1a2736dc8df/volumes" Apr 17 16:56:45.823078 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:56:45.823036 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz" podUID="825f2b45-8699-4e88-9f42-348c0b14ba6f" containerName="main" probeResult="failure" output="Get \"https://10.133.0.55:8001/health\": dial tcp 10.133.0.55:8001: connect: connection refused" Apr 17 16:56:55.822393 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:56:55.822343 2572 prober.go:120] "Probe failed" probeType="Startup" 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz" podUID="825f2b45-8699-4e88-9f42-348c0b14ba6f" containerName="main" probeResult="failure" output="Get \"https://10.133.0.55:8001/health\": dial tcp 10.133.0.55:8001: connect: connection refused" Apr 17 16:57:05.822404 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:57:05.822355 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz" podUID="825f2b45-8699-4e88-9f42-348c0b14ba6f" containerName="main" probeResult="failure" output="Get \"https://10.133.0.55:8001/health\": dial tcp 10.133.0.55:8001: connect: connection refused" Apr 17 16:57:15.822395 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:57:15.822343 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz" podUID="825f2b45-8699-4e88-9f42-348c0b14ba6f" containerName="main" probeResult="failure" output="Get \"https://10.133.0.55:8001/health\": dial tcp 10.133.0.55:8001: connect: connection refused" Apr 17 16:57:25.823016 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:57:25.822965 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz" podUID="825f2b45-8699-4e88-9f42-348c0b14ba6f" containerName="main" probeResult="failure" output="Get \"https://10.133.0.55:8001/health\": dial tcp 10.133.0.55:8001: connect: connection refused" Apr 17 16:57:35.823117 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:57:35.823069 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz" podUID="825f2b45-8699-4e88-9f42-348c0b14ba6f" containerName="main" probeResult="failure" output="Get \"https://10.133.0.55:8001/health\": dial tcp 10.133.0.55:8001: connect: connection refused" Apr 17 16:57:45.822169 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:57:45.822109 2572 prober.go:120] "Probe 
failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz" podUID="825f2b45-8699-4e88-9f42-348c0b14ba6f" containerName="main" probeResult="failure" output="Get \"https://10.133.0.55:8001/health\": dial tcp 10.133.0.55:8001: connect: connection refused" Apr 17 16:57:55.832257 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:57:55.832224 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz" Apr 17 16:57:55.845343 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:57:55.845311 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz" Apr 17 16:58:07.219200 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:07.219168 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz"] Apr 17 16:58:07.219653 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:07.219591 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz" podUID="825f2b45-8699-4e88-9f42-348c0b14ba6f" containerName="main" containerID="cri-o://5e5e33a610a0dd3780154ba87a6da35c9815810ac1b5fc54b85d93c77a89dba3" gracePeriod=30 Apr 17 16:58:22.429401 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:22.429370 2572 ???:1] "http2: server: error reading preface from client 10.0.131.177:43312: read tcp 10.0.131.177:10250->10.0.131.177:43312: read: connection reset by peer" Apr 17 16:58:22.441658 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:22.441632 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz_825f2b45-8699-4e88-9f42-348c0b14ba6f/main/0.log" Apr 17 16:58:22.468227 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:22.468180 2572 log.go:25] "Finished parsing log 
file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz_825f2b45-8699-4e88-9f42-348c0b14ba6f/llm-d-routing-sidecar/0.log" Apr 17 16:58:22.481549 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:22.481523 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz_825f2b45-8699-4e88-9f42-348c0b14ba6f/storage-initializer/0.log" Apr 17 16:58:23.487464 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:23.487435 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz_825f2b45-8699-4e88-9f42-348c0b14ba6f/main/0.log" Apr 17 16:58:23.496632 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:23.496610 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz_825f2b45-8699-4e88-9f42-348c0b14ba6f/llm-d-routing-sidecar/0.log" Apr 17 16:58:23.509690 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:23.509648 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz_825f2b45-8699-4e88-9f42-348c0b14ba6f/storage-initializer/0.log" Apr 17 16:58:24.539760 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:24.539702 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz_825f2b45-8699-4e88-9f42-348c0b14ba6f/main/0.log" Apr 17 16:58:24.547336 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:24.547308 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz_825f2b45-8699-4e88-9f42-348c0b14ba6f/llm-d-routing-sidecar/0.log" Apr 17 16:58:24.557900 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:24.557872 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz_825f2b45-8699-4e88-9f42-348c0b14ba6f/storage-initializer/0.log" Apr 17 16:58:25.586672 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:25.586640 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz_825f2b45-8699-4e88-9f42-348c0b14ba6f/main/0.log" Apr 17 16:58:25.595018 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:25.594994 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz_825f2b45-8699-4e88-9f42-348c0b14ba6f/llm-d-routing-sidecar/0.log" Apr 17 16:58:25.607039 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:25.607015 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz_825f2b45-8699-4e88-9f42-348c0b14ba6f/storage-initializer/0.log" Apr 17 16:58:26.593231 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:26.593197 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz_825f2b45-8699-4e88-9f42-348c0b14ba6f/main/0.log" Apr 17 16:58:26.600764 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:26.600744 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz_825f2b45-8699-4e88-9f42-348c0b14ba6f/llm-d-routing-sidecar/0.log" Apr 17 16:58:26.611585 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:26.611562 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz_825f2b45-8699-4e88-9f42-348c0b14ba6f/storage-initializer/0.log" Apr 17 16:58:27.640545 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:27.640513 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz_825f2b45-8699-4e88-9f42-348c0b14ba6f/main/0.log" Apr 17 16:58:27.649086 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:27.649054 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz_825f2b45-8699-4e88-9f42-348c0b14ba6f/llm-d-routing-sidecar/0.log" Apr 17 16:58:27.659825 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:27.659796 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz_825f2b45-8699-4e88-9f42-348c0b14ba6f/storage-initializer/0.log" Apr 17 16:58:28.662103 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:28.662075 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz_825f2b45-8699-4e88-9f42-348c0b14ba6f/main/0.log" Apr 17 16:58:28.669310 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:28.669286 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz_825f2b45-8699-4e88-9f42-348c0b14ba6f/llm-d-routing-sidecar/0.log" Apr 17 16:58:28.680619 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:28.680594 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz_825f2b45-8699-4e88-9f42-348c0b14ba6f/storage-initializer/0.log" Apr 17 16:58:29.671199 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:29.671162 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz_825f2b45-8699-4e88-9f42-348c0b14ba6f/main/0.log" Apr 17 16:58:29.678220 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:29.678197 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz_825f2b45-8699-4e88-9f42-348c0b14ba6f/llm-d-routing-sidecar/0.log" Apr 17 16:58:29.689808 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:29.689781 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz_825f2b45-8699-4e88-9f42-348c0b14ba6f/storage-initializer/0.log" Apr 17 16:58:30.684119 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:30.684082 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz_825f2b45-8699-4e88-9f42-348c0b14ba6f/main/0.log" Apr 17 16:58:30.692790 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:30.692761 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz_825f2b45-8699-4e88-9f42-348c0b14ba6f/llm-d-routing-sidecar/0.log" Apr 17 16:58:30.703747 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:30.703710 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz_825f2b45-8699-4e88-9f42-348c0b14ba6f/storage-initializer/0.log" Apr 17 16:58:31.718568 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:31.718537 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz_825f2b45-8699-4e88-9f42-348c0b14ba6f/main/0.log" Apr 17 16:58:31.726533 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:31.726509 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz_825f2b45-8699-4e88-9f42-348c0b14ba6f/llm-d-routing-sidecar/0.log" Apr 17 16:58:31.738356 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:31.738334 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz_825f2b45-8699-4e88-9f42-348c0b14ba6f/storage-initializer/0.log" Apr 17 16:58:32.758915 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:32.758885 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz_825f2b45-8699-4e88-9f42-348c0b14ba6f/main/0.log" Apr 17 16:58:32.769501 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:32.769456 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz_825f2b45-8699-4e88-9f42-348c0b14ba6f/llm-d-routing-sidecar/0.log" Apr 17 16:58:32.786130 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:32.786110 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz_825f2b45-8699-4e88-9f42-348c0b14ba6f/storage-initializer/0.log" Apr 17 16:58:33.864076 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:33.864043 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz_825f2b45-8699-4e88-9f42-348c0b14ba6f/main/0.log" Apr 17 16:58:33.873818 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:33.873789 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz_825f2b45-8699-4e88-9f42-348c0b14ba6f/llm-d-routing-sidecar/0.log" Apr 17 16:58:33.897166 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:33.897148 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz_825f2b45-8699-4e88-9f42-348c0b14ba6f/storage-initializer/0.log" Apr 17 16:58:35.025537 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:35.025502 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz_825f2b45-8699-4e88-9f42-348c0b14ba6f/main/0.log" Apr 17 16:58:35.034832 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:35.034807 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz_825f2b45-8699-4e88-9f42-348c0b14ba6f/llm-d-routing-sidecar/0.log" Apr 17 16:58:35.047548 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:35.047526 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz_825f2b45-8699-4e88-9f42-348c0b14ba6f/storage-initializer/0.log" Apr 17 16:58:36.058122 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:36.058078 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz_825f2b45-8699-4e88-9f42-348c0b14ba6f/main/0.log" Apr 17 16:58:36.065496 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:36.065474 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz_825f2b45-8699-4e88-9f42-348c0b14ba6f/llm-d-routing-sidecar/0.log" Apr 17 16:58:36.077548 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:36.077529 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz_825f2b45-8699-4e88-9f42-348c0b14ba6f/storage-initializer/0.log" Apr 17 16:58:37.220033 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:37.219989 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz" podUID="825f2b45-8699-4e88-9f42-348c0b14ba6f" containerName="llm-d-routing-sidecar" containerID="cri-o://7b1b27c837c75eb1e6bcce5c41bd3a4c8c7a1a6e064df79e655d75e1d613c819" gracePeriod=2 Apr 17 16:58:37.448795 ip-10-0-131-177 kubenswrapper[2572]: I0417 
16:58:37.448765 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz_825f2b45-8699-4e88-9f42-348c0b14ba6f/main/0.log" Apr 17 16:58:37.449478 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:37.449451 2572 generic.go:358] "Generic (PLEG): container finished" podID="825f2b45-8699-4e88-9f42-348c0b14ba6f" containerID="5e5e33a610a0dd3780154ba87a6da35c9815810ac1b5fc54b85d93c77a89dba3" exitCode=137 Apr 17 16:58:37.449478 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:37.449471 2572 generic.go:358] "Generic (PLEG): container finished" podID="825f2b45-8699-4e88-9f42-348c0b14ba6f" containerID="7b1b27c837c75eb1e6bcce5c41bd3a4c8c7a1a6e064df79e655d75e1d613c819" exitCode=0 Apr 17 16:58:37.449671 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:37.449523 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz" event={"ID":"825f2b45-8699-4e88-9f42-348c0b14ba6f","Type":"ContainerDied","Data":"5e5e33a610a0dd3780154ba87a6da35c9815810ac1b5fc54b85d93c77a89dba3"} Apr 17 16:58:37.449671 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:37.449563 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz" event={"ID":"825f2b45-8699-4e88-9f42-348c0b14ba6f","Type":"ContainerDied","Data":"7b1b27c837c75eb1e6bcce5c41bd3a4c8c7a1a6e064df79e655d75e1d613c819"} Apr 17 16:58:37.467158 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:37.467139 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz_825f2b45-8699-4e88-9f42-348c0b14ba6f/main/0.log" Apr 17 16:58:37.467750 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:37.467734 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz" Apr 17 16:58:37.548361 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:37.548284 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/825f2b45-8699-4e88-9f42-348c0b14ba6f-kserve-provision-location\") pod \"825f2b45-8699-4e88-9f42-348c0b14ba6f\" (UID: \"825f2b45-8699-4e88-9f42-348c0b14ba6f\") " Apr 17 16:58:37.548361 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:37.548322 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/825f2b45-8699-4e88-9f42-348c0b14ba6f-tls-certs\") pod \"825f2b45-8699-4e88-9f42-348c0b14ba6f\" (UID: \"825f2b45-8699-4e88-9f42-348c0b14ba6f\") " Apr 17 16:58:37.548361 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:37.548343 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/825f2b45-8699-4e88-9f42-348c0b14ba6f-dshm\") pod \"825f2b45-8699-4e88-9f42-348c0b14ba6f\" (UID: \"825f2b45-8699-4e88-9f42-348c0b14ba6f\") " Apr 17 16:58:37.548627 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:37.548370 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gs64n\" (UniqueName: \"kubernetes.io/projected/825f2b45-8699-4e88-9f42-348c0b14ba6f-kube-api-access-gs64n\") pod \"825f2b45-8699-4e88-9f42-348c0b14ba6f\" (UID: \"825f2b45-8699-4e88-9f42-348c0b14ba6f\") " Apr 17 16:58:37.548627 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:37.548505 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/825f2b45-8699-4e88-9f42-348c0b14ba6f-home\") pod \"825f2b45-8699-4e88-9f42-348c0b14ba6f\" (UID: \"825f2b45-8699-4e88-9f42-348c0b14ba6f\") " Apr 17 16:58:37.548627 ip-10-0-131-177 
kubenswrapper[2572]: I0417 16:58:37.548606 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/825f2b45-8699-4e88-9f42-348c0b14ba6f-model-cache\") pod \"825f2b45-8699-4e88-9f42-348c0b14ba6f\" (UID: \"825f2b45-8699-4e88-9f42-348c0b14ba6f\") " Apr 17 16:58:37.549359 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:37.549323 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/825f2b45-8699-4e88-9f42-348c0b14ba6f-home" (OuterVolumeSpecName: "home") pod "825f2b45-8699-4e88-9f42-348c0b14ba6f" (UID: "825f2b45-8699-4e88-9f42-348c0b14ba6f"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:58:37.549359 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:37.549337 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/825f2b45-8699-4e88-9f42-348c0b14ba6f-model-cache" (OuterVolumeSpecName: "model-cache") pod "825f2b45-8699-4e88-9f42-348c0b14ba6f" (UID: "825f2b45-8699-4e88-9f42-348c0b14ba6f"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:58:37.550680 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:37.550652 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/825f2b45-8699-4e88-9f42-348c0b14ba6f-dshm" (OuterVolumeSpecName: "dshm") pod "825f2b45-8699-4e88-9f42-348c0b14ba6f" (UID: "825f2b45-8699-4e88-9f42-348c0b14ba6f"). InnerVolumeSpecName "dshm". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:58:37.550813 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:37.550760 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/825f2b45-8699-4e88-9f42-348c0b14ba6f-kube-api-access-gs64n" (OuterVolumeSpecName: "kube-api-access-gs64n") pod "825f2b45-8699-4e88-9f42-348c0b14ba6f" (UID: "825f2b45-8699-4e88-9f42-348c0b14ba6f"). InnerVolumeSpecName "kube-api-access-gs64n". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:58:37.550813 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:37.550801 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/825f2b45-8699-4e88-9f42-348c0b14ba6f-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "825f2b45-8699-4e88-9f42-348c0b14ba6f" (UID: "825f2b45-8699-4e88-9f42-348c0b14ba6f"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:58:37.606139 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:37.606105 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/825f2b45-8699-4e88-9f42-348c0b14ba6f-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "825f2b45-8699-4e88-9f42-348c0b14ba6f" (UID: "825f2b45-8699-4e88-9f42-348c0b14ba6f"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:58:37.649864 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:37.649828 2572 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/825f2b45-8699-4e88-9f42-348c0b14ba6f-model-cache\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\"" Apr 17 16:58:37.649864 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:37.649861 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/825f2b45-8699-4e88-9f42-348c0b14ba6f-kserve-provision-location\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\"" Apr 17 16:58:37.649864 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:37.649871 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/825f2b45-8699-4e88-9f42-348c0b14ba6f-tls-certs\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\"" Apr 17 16:58:37.650069 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:37.649880 2572 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/825f2b45-8699-4e88-9f42-348c0b14ba6f-dshm\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\"" Apr 17 16:58:37.650069 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:37.649889 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gs64n\" (UniqueName: \"kubernetes.io/projected/825f2b45-8699-4e88-9f42-348c0b14ba6f-kube-api-access-gs64n\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\"" Apr 17 16:58:37.650069 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:37.649896 2572 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/825f2b45-8699-4e88-9f42-348c0b14ba6f-home\") on node \"ip-10-0-131-177.ec2.internal\" DevicePath \"\"" Apr 17 16:58:38.454764 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:38.454710 2572 
log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz_825f2b45-8699-4e88-9f42-348c0b14ba6f/main/0.log" Apr 17 16:58:38.455352 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:38.455329 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz" event={"ID":"825f2b45-8699-4e88-9f42-348c0b14ba6f","Type":"ContainerDied","Data":"5b390f15ff02934de418cabfe741ba3d213672269921c66a9b9a24de5175de48"} Apr 17 16:58:38.455406 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:38.455379 2572 scope.go:117] "RemoveContainer" containerID="5e5e33a610a0dd3780154ba87a6da35c9815810ac1b5fc54b85d93c77a89dba3" Apr 17 16:58:38.455406 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:38.455396 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz" Apr 17 16:58:38.474341 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:38.474321 2572 scope.go:117] "RemoveContainer" containerID="06dc6c7189cb1fb35e641bcb05c273d35072f9eb6b65455e46c26f8c704cf8a7" Apr 17 16:58:38.483492 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:38.483468 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz"] Apr 17 16:58:38.490685 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:38.490662 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-77f9b5fb4f-4rvmz"] Apr 17 16:58:38.543531 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:38.543500 2572 scope.go:117] "RemoveContainer" containerID="7b1b27c837c75eb1e6bcce5c41bd3a4c8c7a1a6e064df79e655d75e1d613c819" Apr 17 16:58:38.798234 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:38.798156 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6c886788f8-4mfhw_7923ca69-0185-41c6-84a2-64e801337d1d/kuadrant-console-plugin/0.log" Apr 17 16:58:38.845961 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:38.845932 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-67566c68b4-nw4zx_94060e29-bfc5-4851-8bd6-bc14a5f01acc/limitador/0.log" Apr 17 16:58:39.974084 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:39.974048 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="825f2b45-8699-4e88-9f42-348c0b14ba6f" path="/var/lib/kubelet/pods/825f2b45-8699-4e88-9f42-348c0b14ba6f/volumes" Apr 17 16:58:41.171448 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:41.171408 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-2whr7/must-gather-zrhl2"] Apr 17 16:58:41.171843 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:41.171768 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="72151249-717b-4f4d-bb07-b1a2736dc8df" containerName="main" Apr 17 16:58:41.171843 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:41.171779 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="72151249-717b-4f4d-bb07-b1a2736dc8df" containerName="main" Apr 17 16:58:41.171843 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:41.171789 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="825f2b45-8699-4e88-9f42-348c0b14ba6f" containerName="main" Apr 17 16:58:41.171843 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:41.171794 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="825f2b45-8699-4e88-9f42-348c0b14ba6f" containerName="main" Apr 17 16:58:41.171843 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:41.171804 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="72151249-717b-4f4d-bb07-b1a2736dc8df" containerName="llm-d-routing-sidecar" Apr 17 16:58:41.171843 ip-10-0-131-177 
kubenswrapper[2572]: I0417 16:58:41.171810 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="72151249-717b-4f4d-bb07-b1a2736dc8df" containerName="llm-d-routing-sidecar" Apr 17 16:58:41.171843 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:41.171818 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="72151249-717b-4f4d-bb07-b1a2736dc8df" containerName="storage-initializer" Apr 17 16:58:41.171843 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:41.171824 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="72151249-717b-4f4d-bb07-b1a2736dc8df" containerName="storage-initializer" Apr 17 16:58:41.171843 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:41.171833 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="825f2b45-8699-4e88-9f42-348c0b14ba6f" containerName="llm-d-routing-sidecar" Apr 17 16:58:41.171843 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:41.171838 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="825f2b45-8699-4e88-9f42-348c0b14ba6f" containerName="llm-d-routing-sidecar" Apr 17 16:58:41.172171 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:41.171852 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="825f2b45-8699-4e88-9f42-348c0b14ba6f" containerName="storage-initializer" Apr 17 16:58:41.172299 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:41.172279 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="825f2b45-8699-4e88-9f42-348c0b14ba6f" containerName="storage-initializer" Apr 17 16:58:41.172645 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:41.172621 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="72151249-717b-4f4d-bb07-b1a2736dc8df" containerName="llm-d-routing-sidecar" Apr 17 16:58:41.172711 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:41.172662 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="825f2b45-8699-4e88-9f42-348c0b14ba6f" 
containerName="llm-d-routing-sidecar" Apr 17 16:58:41.172711 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:41.172685 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="72151249-717b-4f4d-bb07-b1a2736dc8df" containerName="main" Apr 17 16:58:41.172711 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:41.172700 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="825f2b45-8699-4e88-9f42-348c0b14ba6f" containerName="main" Apr 17 16:58:41.179953 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:41.179931 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2whr7/must-gather-zrhl2" Apr 17 16:58:41.182333 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:41.182311 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-2whr7\"/\"default-dockercfg-8vtfj\"" Apr 17 16:58:41.182333 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:41.182329 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-2whr7\"/\"openshift-service-ca.crt\"" Apr 17 16:58:41.182544 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:41.182313 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-2whr7\"/\"kube-root-ca.crt\"" Apr 17 16:58:41.184427 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:41.184405 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-2whr7/must-gather-zrhl2"] Apr 17 16:58:41.280583 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:41.280538 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/dfb3a313-8e40-467e-a609-40863737dc74-must-gather-output\") pod \"must-gather-zrhl2\" (UID: \"dfb3a313-8e40-467e-a609-40863737dc74\") " pod="openshift-must-gather-2whr7/must-gather-zrhl2" Apr 17 16:58:41.280795 ip-10-0-131-177 
kubenswrapper[2572]: I0417 16:58:41.280606 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6x6l9\" (UniqueName: \"kubernetes.io/projected/dfb3a313-8e40-467e-a609-40863737dc74-kube-api-access-6x6l9\") pod \"must-gather-zrhl2\" (UID: \"dfb3a313-8e40-467e-a609-40863737dc74\") " pod="openshift-must-gather-2whr7/must-gather-zrhl2" Apr 17 16:58:41.381370 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:41.381337 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/dfb3a313-8e40-467e-a609-40863737dc74-must-gather-output\") pod \"must-gather-zrhl2\" (UID: \"dfb3a313-8e40-467e-a609-40863737dc74\") " pod="openshift-must-gather-2whr7/must-gather-zrhl2" Apr 17 16:58:41.381550 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:41.381380 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6x6l9\" (UniqueName: \"kubernetes.io/projected/dfb3a313-8e40-467e-a609-40863737dc74-kube-api-access-6x6l9\") pod \"must-gather-zrhl2\" (UID: \"dfb3a313-8e40-467e-a609-40863737dc74\") " pod="openshift-must-gather-2whr7/must-gather-zrhl2" Apr 17 16:58:41.381742 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:41.381700 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/dfb3a313-8e40-467e-a609-40863737dc74-must-gather-output\") pod \"must-gather-zrhl2\" (UID: \"dfb3a313-8e40-467e-a609-40863737dc74\") " pod="openshift-must-gather-2whr7/must-gather-zrhl2" Apr 17 16:58:41.396239 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:41.396213 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6x6l9\" (UniqueName: \"kubernetes.io/projected/dfb3a313-8e40-467e-a609-40863737dc74-kube-api-access-6x6l9\") pod \"must-gather-zrhl2\" (UID: \"dfb3a313-8e40-467e-a609-40863737dc74\") " 
pod="openshift-must-gather-2whr7/must-gather-zrhl2" Apr 17 16:58:41.491707 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:41.491635 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2whr7/must-gather-zrhl2" Apr 17 16:58:41.619870 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:41.619845 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-2whr7/must-gather-zrhl2"] Apr 17 16:58:41.621801 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:58:41.621771 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddfb3a313_8e40_467e_a609_40863737dc74.slice/crio-d4bb3c703ec52588ae9240ba3cfa6528c39fcbb8da2f4690f71364a783b910e1 WatchSource:0}: Error finding container d4bb3c703ec52588ae9240ba3cfa6528c39fcbb8da2f4690f71364a783b910e1: Status 404 returned error can't find the container with id d4bb3c703ec52588ae9240ba3cfa6528c39fcbb8da2f4690f71364a783b910e1 Apr 17 16:58:41.623414 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:41.623394 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 16:58:42.475211 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:42.475171 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2whr7/must-gather-zrhl2" event={"ID":"dfb3a313-8e40-467e-a609-40863737dc74","Type":"ContainerStarted","Data":"d59381ebd6a75a74c31f41ffbf4d2880864009597a0091d7fb13c3e24f505262"} Apr 17 16:58:42.475624 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:42.475216 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2whr7/must-gather-zrhl2" event={"ID":"dfb3a313-8e40-467e-a609-40863737dc74","Type":"ContainerStarted","Data":"d4bb3c703ec52588ae9240ba3cfa6528c39fcbb8da2f4690f71364a783b910e1"} Apr 17 16:58:43.482400 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:43.482356 2572 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-must-gather-2whr7/must-gather-zrhl2" event={"ID":"dfb3a313-8e40-467e-a609-40863737dc74","Type":"ContainerStarted","Data":"5b47346e6819eb84e0a3c932572aa9bf7bac87e9df735bd54c2e4b30cb602312"} Apr 17 16:58:43.500629 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:43.500579 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-2whr7/must-gather-zrhl2" podStartSLOduration=1.798500882 podStartE2EDuration="2.500564281s" podCreationTimestamp="2026-04-17 16:58:41 +0000 UTC" firstStartedPulling="2026-04-17 16:58:41.623532479 +0000 UTC m=+1644.211121428" lastFinishedPulling="2026-04-17 16:58:42.325595879 +0000 UTC m=+1644.913184827" observedRunningTime="2026-04-17 16:58:43.498066802 +0000 UTC m=+1646.085655772" watchObservedRunningTime="2026-04-17 16:58:43.500564281 +0000 UTC m=+1646.088153294" Apr 17 16:58:43.922156 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:43.922121 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-rxw7n_6644c140-1be7-4a25-becc-56acd5a0f042/global-pull-secret-syncer/0.log" Apr 17 16:58:43.991161 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:43.991124 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-clpwx_e7e38ded-7b99-4b86-9ba3-f0cdd8e37344/konnectivity-agent/0.log" Apr 17 16:58:44.106922 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:44.106894 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-131-177.ec2.internal_50b2c37a28961f1c8aacb6ad5db58d22/haproxy/0.log" Apr 17 16:58:48.147590 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:48.147479 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6c886788f8-4mfhw_7923ca69-0185-41c6-84a2-64e801337d1d/kuadrant-console-plugin/0.log" Apr 17 16:58:48.231214 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:48.231186 2572 
log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-67566c68b4-nw4zx_94060e29-bfc5-4851-8bd6-bc14a5f01acc/limitador/0.log" Apr 17 16:58:49.283282 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:49.283245 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_0e7da448-f5ed-4b33-9c73-443939def85b/alertmanager/0.log" Apr 17 16:58:49.308039 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:49.308002 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_0e7da448-f5ed-4b33-9c73-443939def85b/config-reloader/0.log" Apr 17 16:58:49.333199 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:49.333168 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_0e7da448-f5ed-4b33-9c73-443939def85b/kube-rbac-proxy-web/0.log" Apr 17 16:58:49.357269 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:49.357240 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_0e7da448-f5ed-4b33-9c73-443939def85b/kube-rbac-proxy/0.log" Apr 17 16:58:49.388348 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:49.388324 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_0e7da448-f5ed-4b33-9c73-443939def85b/kube-rbac-proxy-metric/0.log" Apr 17 16:58:49.412634 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:49.412591 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_0e7da448-f5ed-4b33-9c73-443939def85b/prom-label-proxy/0.log" Apr 17 16:58:49.436219 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:49.436172 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_0e7da448-f5ed-4b33-9c73-443939def85b/init-config-reloader/0.log" Apr 17 16:58:49.486425 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:49.486393 2572 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-v7vr6_a280da6f-8899-47aa-ac6c-6b5ddcada842/cluster-monitoring-operator/0.log" Apr 17 16:58:49.754471 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:49.754439 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-cc2ks_f55fedf9-4195-42a7-b7a3-7e91687196ed/node-exporter/0.log" Apr 17 16:58:49.780223 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:49.780198 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-cc2ks_f55fedf9-4195-42a7-b7a3-7e91687196ed/kube-rbac-proxy/0.log" Apr 17 16:58:49.808354 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:49.808324 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-cc2ks_f55fedf9-4195-42a7-b7a3-7e91687196ed/init-textfile/0.log" Apr 17 16:58:50.219280 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:50.219244 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-jgxs5_f0a45005-1762-44a5-a570-e1300c5a70e4/prometheus-operator/0.log" Apr 17 16:58:50.263018 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:50.262972 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-jgxs5_f0a45005-1762-44a5-a570-e1300c5a70e4/kube-rbac-proxy/0.log" Apr 17 16:58:50.330792 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:50.330757 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-5fcbc8f5f4-cnkrt_22ec0cf7-7086-4276-80b3-231844c0a7a5/telemeter-client/0.log" Apr 17 16:58:50.358989 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:50.358895 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-5fcbc8f5f4-cnkrt_22ec0cf7-7086-4276-80b3-231844c0a7a5/reload/0.log" Apr 17 
16:58:50.389961 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:50.389929 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-5fcbc8f5f4-cnkrt_22ec0cf7-7086-4276-80b3-231844c0a7a5/kube-rbac-proxy/0.log" Apr 17 16:58:52.271007 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:52.270976 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-qghlf_4135c5b6-7f8a-4eaf-b551-405c8ab00981/console-operator/1.log" Apr 17 16:58:52.281994 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:52.281959 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-qghlf_4135c5b6-7f8a-4eaf-b551-405c8ab00981/console-operator/2.log" Apr 17 16:58:52.684862 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:52.684831 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-2whr7/perf-node-gather-daemonset-w9dgm"] Apr 17 16:58:52.690148 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:52.690118 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-2whr7/perf-node-gather-daemonset-w9dgm" Apr 17 16:58:52.697092 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:52.697066 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-2whr7/perf-node-gather-daemonset-w9dgm"] Apr 17 16:58:52.749380 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:52.749331 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-79d89b468f-t84r9_fc0c0d7a-e38d-4dac-a3cc-dbddcbbcfec8/console/0.log" Apr 17 16:58:52.790619 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:52.790590 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-62lmr_9f5ed73a-d8e4-4937-b8df-b1d3c48f6efa/download-server/0.log" Apr 17 16:58:52.796645 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:52.796617 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/175c277d-47c0-4c0f-826f-b082ec624624-podres\") pod \"perf-node-gather-daemonset-w9dgm\" (UID: \"175c277d-47c0-4c0f-826f-b082ec624624\") " pod="openshift-must-gather-2whr7/perf-node-gather-daemonset-w9dgm" Apr 17 16:58:52.796776 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:52.796652 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/175c277d-47c0-4c0f-826f-b082ec624624-sys\") pod \"perf-node-gather-daemonset-w9dgm\" (UID: \"175c277d-47c0-4c0f-826f-b082ec624624\") " pod="openshift-must-gather-2whr7/perf-node-gather-daemonset-w9dgm" Apr 17 16:58:52.796776 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:52.796690 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/175c277d-47c0-4c0f-826f-b082ec624624-lib-modules\") pod \"perf-node-gather-daemonset-w9dgm\" (UID: 
\"175c277d-47c0-4c0f-826f-b082ec624624\") " pod="openshift-must-gather-2whr7/perf-node-gather-daemonset-w9dgm" Apr 17 16:58:52.796776 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:52.796712 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wn5kp\" (UniqueName: \"kubernetes.io/projected/175c277d-47c0-4c0f-826f-b082ec624624-kube-api-access-wn5kp\") pod \"perf-node-gather-daemonset-w9dgm\" (UID: \"175c277d-47c0-4c0f-826f-b082ec624624\") " pod="openshift-must-gather-2whr7/perf-node-gather-daemonset-w9dgm" Apr 17 16:58:52.796776 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:52.796756 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/175c277d-47c0-4c0f-826f-b082ec624624-proc\") pod \"perf-node-gather-daemonset-w9dgm\" (UID: \"175c277d-47c0-4c0f-826f-b082ec624624\") " pod="openshift-must-gather-2whr7/perf-node-gather-daemonset-w9dgm" Apr 17 16:58:52.898157 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:52.898127 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/175c277d-47c0-4c0f-826f-b082ec624624-podres\") pod \"perf-node-gather-daemonset-w9dgm\" (UID: \"175c277d-47c0-4c0f-826f-b082ec624624\") " pod="openshift-must-gather-2whr7/perf-node-gather-daemonset-w9dgm" Apr 17 16:58:52.898366 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:52.898163 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/175c277d-47c0-4c0f-826f-b082ec624624-sys\") pod \"perf-node-gather-daemonset-w9dgm\" (UID: \"175c277d-47c0-4c0f-826f-b082ec624624\") " pod="openshift-must-gather-2whr7/perf-node-gather-daemonset-w9dgm" Apr 17 16:58:52.898366 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:52.898198 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/175c277d-47c0-4c0f-826f-b082ec624624-lib-modules\") pod \"perf-node-gather-daemonset-w9dgm\" (UID: \"175c277d-47c0-4c0f-826f-b082ec624624\") " pod="openshift-must-gather-2whr7/perf-node-gather-daemonset-w9dgm" Apr 17 16:58:52.898366 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:52.898220 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wn5kp\" (UniqueName: \"kubernetes.io/projected/175c277d-47c0-4c0f-826f-b082ec624624-kube-api-access-wn5kp\") pod \"perf-node-gather-daemonset-w9dgm\" (UID: \"175c277d-47c0-4c0f-826f-b082ec624624\") " pod="openshift-must-gather-2whr7/perf-node-gather-daemonset-w9dgm" Apr 17 16:58:52.898366 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:52.898246 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/175c277d-47c0-4c0f-826f-b082ec624624-proc\") pod \"perf-node-gather-daemonset-w9dgm\" (UID: \"175c277d-47c0-4c0f-826f-b082ec624624\") " pod="openshift-must-gather-2whr7/perf-node-gather-daemonset-w9dgm" Apr 17 16:58:52.898366 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:52.898275 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/175c277d-47c0-4c0f-826f-b082ec624624-sys\") pod \"perf-node-gather-daemonset-w9dgm\" (UID: \"175c277d-47c0-4c0f-826f-b082ec624624\") " pod="openshift-must-gather-2whr7/perf-node-gather-daemonset-w9dgm" Apr 17 16:58:52.898366 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:52.898308 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/175c277d-47c0-4c0f-826f-b082ec624624-proc\") pod \"perf-node-gather-daemonset-w9dgm\" (UID: \"175c277d-47c0-4c0f-826f-b082ec624624\") " pod="openshift-must-gather-2whr7/perf-node-gather-daemonset-w9dgm" Apr 17 16:58:52.898366 ip-10-0-131-177 
kubenswrapper[2572]: I0417 16:58:52.898317 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/175c277d-47c0-4c0f-826f-b082ec624624-podres\") pod \"perf-node-gather-daemonset-w9dgm\" (UID: \"175c277d-47c0-4c0f-826f-b082ec624624\") " pod="openshift-must-gather-2whr7/perf-node-gather-daemonset-w9dgm" Apr 17 16:58:52.898366 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:52.898331 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/175c277d-47c0-4c0f-826f-b082ec624624-lib-modules\") pod \"perf-node-gather-daemonset-w9dgm\" (UID: \"175c277d-47c0-4c0f-826f-b082ec624624\") " pod="openshift-must-gather-2whr7/perf-node-gather-daemonset-w9dgm" Apr 17 16:58:52.907106 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:52.907083 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wn5kp\" (UniqueName: \"kubernetes.io/projected/175c277d-47c0-4c0f-826f-b082ec624624-kube-api-access-wn5kp\") pod \"perf-node-gather-daemonset-w9dgm\" (UID: \"175c277d-47c0-4c0f-826f-b082ec624624\") " pod="openshift-must-gather-2whr7/perf-node-gather-daemonset-w9dgm" Apr 17 16:58:53.004928 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:53.004844 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-2whr7/perf-node-gather-daemonset-w9dgm" Apr 17 16:58:53.150882 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:53.150851 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-2whr7/perf-node-gather-daemonset-w9dgm"] Apr 17 16:58:53.151845 ip-10-0-131-177 kubenswrapper[2572]: W0417 16:58:53.151814 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod175c277d_47c0_4c0f_826f_b082ec624624.slice/crio-aecb5fd456c7213365b29a80df08c4c2d4915da3941cf20ce931ea76b252b545 WatchSource:0}: Error finding container aecb5fd456c7213365b29a80df08c4c2d4915da3941cf20ce931ea76b252b545: Status 404 returned error can't find the container with id aecb5fd456c7213365b29a80df08c4c2d4915da3941cf20ce931ea76b252b545 Apr 17 16:58:53.249094 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:53.249066 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-fpcqf_7ad42cb5-577d-4b4b-b374-97b0a9270a0e/volume-data-source-validator/0.log" Apr 17 16:58:53.552631 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:53.552597 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2whr7/perf-node-gather-daemonset-w9dgm" event={"ID":"175c277d-47c0-4c0f-826f-b082ec624624","Type":"ContainerStarted","Data":"b1ce923227f54532860ae086d7d97f8e691730d1bbd47612602cf73dc77c9ba8"} Apr 17 16:58:53.552631 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:53.552637 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2whr7/perf-node-gather-daemonset-w9dgm" event={"ID":"175c277d-47c0-4c0f-826f-b082ec624624","Type":"ContainerStarted","Data":"aecb5fd456c7213365b29a80df08c4c2d4915da3941cf20ce931ea76b252b545"} Apr 17 16:58:53.574157 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:53.571856 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-must-gather-2whr7/perf-node-gather-daemonset-w9dgm" podStartSLOduration=1.571838739 podStartE2EDuration="1.571838739s" podCreationTimestamp="2026-04-17 16:58:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:58:53.570211564 +0000 UTC m=+1656.157800538" watchObservedRunningTime="2026-04-17 16:58:53.571838739 +0000 UTC m=+1656.159427711" Apr 17 16:58:54.029595 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:54.029569 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-4nqbv_7d470685-9573-40d7-b32c-929ed88cc56d/dns/0.log" Apr 17 16:58:54.057426 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:54.057381 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-4nqbv_7d470685-9573-40d7-b32c-929ed88cc56d/kube-rbac-proxy/0.log" Apr 17 16:58:54.313927 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:54.313807 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-5q2mv_8528bd48-4f37-4fef-bd6e-7df9d6a2773f/dns-node-resolver/0.log" Apr 17 16:58:54.558161 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:54.558124 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-2whr7/perf-node-gather-daemonset-w9dgm" Apr 17 16:58:54.897157 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:54.897123 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-gmh4h_2685d399-ce45-4aec-bf5c-ce5d17cb16f4/node-ca/0.log" Apr 17 16:58:56.251344 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:56.251316 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-gvlnm_d60e97bd-f20c-497d-ae2a-6dac86b93c77/serve-healthcheck-canary/0.log" Apr 17 16:58:56.751891 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:56.751837 2572 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-l6mkp_cb478791-e3d5-4b73-803c-4c43377c9ebc/insights-operator/0.log" Apr 17 16:58:56.753258 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:56.753236 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-l6mkp_cb478791-e3d5-4b73-803c-4c43377c9ebc/insights-operator/1.log" Apr 17 16:58:56.773388 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:56.773368 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-87jqc_017a011d-f629-47fd-9ac9-16112dbabf6a/kube-rbac-proxy/0.log" Apr 17 16:58:56.795521 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:56.795493 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-87jqc_017a011d-f629-47fd-9ac9-16112dbabf6a/exporter/0.log" Apr 17 16:58:56.822505 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:56.822483 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-87jqc_017a011d-f629-47fd-9ac9-16112dbabf6a/extractor/0.log" Apr 17 16:58:59.584498 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:58:59.584442 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-5f68f6fcb9-mtr9n_a8ddeb87-fbaf-4d48-84a3-44c8eaa95e63/manager/0.log" Apr 17 16:59:00.478566 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:59:00.478530 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-pkbzr_eba3d1d6-340f-40f0-a1fe-699dc39bfa1c/s3-init/0.log" Apr 17 16:59:00.576704 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:59:00.576669 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-2whr7/perf-node-gather-daemonset-w9dgm" Apr 17 16:59:05.255454 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:59:05.255423 2572 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-lbmtc_79cd9bd5-6c64-4090-a94b-e2339628ff70/migrator/0.log" Apr 17 16:59:05.281237 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:59:05.281214 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-lbmtc_79cd9bd5-6c64-4090-a94b-e2339628ff70/graceful-termination/0.log" Apr 17 16:59:07.114398 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:59:07.114360 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mnjjm_759006e8-ec0f-48e4-bf68-b87d7dcbf08e/kube-multus-additional-cni-plugins/0.log" Apr 17 16:59:07.147195 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:59:07.147164 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mnjjm_759006e8-ec0f-48e4-bf68-b87d7dcbf08e/egress-router-binary-copy/0.log" Apr 17 16:59:07.169782 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:59:07.169754 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mnjjm_759006e8-ec0f-48e4-bf68-b87d7dcbf08e/cni-plugins/0.log" Apr 17 16:59:07.193728 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:59:07.193703 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mnjjm_759006e8-ec0f-48e4-bf68-b87d7dcbf08e/bond-cni-plugin/0.log" Apr 17 16:59:07.215144 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:59:07.215116 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mnjjm_759006e8-ec0f-48e4-bf68-b87d7dcbf08e/routeoverride-cni/0.log" Apr 17 16:59:07.238761 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:59:07.238738 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mnjjm_759006e8-ec0f-48e4-bf68-b87d7dcbf08e/whereabouts-cni-bincopy/0.log" Apr 17 16:59:07.262559 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:59:07.262537 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mnjjm_759006e8-ec0f-48e4-bf68-b87d7dcbf08e/whereabouts-cni/0.log" Apr 17 16:59:07.306937 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:59:07.306898 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-kxp4l_ff70e5e0-9bde-4a87-af27-6726427e4ba4/kube-multus/0.log" Apr 17 16:59:07.427363 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:59:07.427283 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-vw79z_36f3412d-e266-4f24-8ea6-1f3d3cdd2546/network-metrics-daemon/0.log" Apr 17 16:59:07.447533 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:59:07.447498 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-vw79z_36f3412d-e266-4f24-8ea6-1f3d3cdd2546/kube-rbac-proxy/0.log" Apr 17 16:59:08.854811 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:59:08.854779 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vj4bz_f0075f81-88ff-4518-bd4e-bc50656a593b/ovn-controller/0.log" Apr 17 16:59:08.873989 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:59:08.873961 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vj4bz_f0075f81-88ff-4518-bd4e-bc50656a593b/ovn-acl-logging/0.log" Apr 17 16:59:08.888803 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:59:08.888779 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vj4bz_f0075f81-88ff-4518-bd4e-bc50656a593b/ovn-acl-logging/1.log" Apr 17 16:59:08.910197 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:59:08.910177 2572 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vj4bz_f0075f81-88ff-4518-bd4e-bc50656a593b/kube-rbac-proxy-node/0.log" Apr 17 16:59:08.936623 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:59:08.936601 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vj4bz_f0075f81-88ff-4518-bd4e-bc50656a593b/kube-rbac-proxy-ovn-metrics/0.log" Apr 17 16:59:08.956424 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:59:08.956389 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vj4bz_f0075f81-88ff-4518-bd4e-bc50656a593b/northd/0.log" Apr 17 16:59:08.977821 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:59:08.977791 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vj4bz_f0075f81-88ff-4518-bd4e-bc50656a593b/nbdb/0.log" Apr 17 16:59:08.999942 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:59:08.999913 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vj4bz_f0075f81-88ff-4518-bd4e-bc50656a593b/sbdb/0.log" Apr 17 16:59:09.190412 ip-10-0-131-177 kubenswrapper[2572]: I0417 16:59:09.190324 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vj4bz_f0075f81-88ff-4518-bd4e-bc50656a593b/ovnkube-controller/0.log"